Conference co-organized by Aude Bandini (UdeM) and Ulf Hlobil (Concordia) on the theme of knowledge in a digital world, to be held as part of the annual congress of the Canadian Society for Epistemology, November 14–16, 2019, at Concordia University.
DEADLINE FOR SUBMISSION: August 1st, 2019
The digital age poses new challenges for epistemology. Digital technologies have become central to how we form, revise, and maintain our beliefs. How should we approach this recent development as epistemologists? What is the epistemological significance of our increasing reliance on, e.g., anonymous online sources, social media, personalized news feeds and search engines? What does the widespread use of AI and opaque algorithms mean for our lives as knowers, testifiers, and reasoners? Do new epistemic responsibilities arise in the digital world? How can we, as epistemologists, contribute to making sense of these developments?
One thing we can do is help identify the epistemic risks associated with these technological trends. As some have already noted, technologies using AI and opaque algorithms might very well, e.g., perpetuate and accentuate biases against marginalized groups, promote epistemic bubbles and echo chambers, help the spread of toxic misinformation (propaganda, hoaxes, conspiracy theories, “fake” news, “deepfakes”), and produce outputs that lack justification. Some of these risks constitute obstacles to acquiring knowledge or justified beliefs about important matters. Others may constitute or perpetuate various forms of epistemic injustice. Epistemic injustices may, e.g., be present in labeled data sets that are used to train artificial neural networks.
This conference is devoted to the epistemic and related ethical implications of digital technology. We invite abstract submissions on any aspect of this broad topic.
Keynote speakers:
Karen Frost-Arnold (Hobart and William Smith Colleges)
J. Adam Carter (University of Glasgow)
This two-day conference invites contributions from epistemologists dealing with any of (but not limited to) the following topics:
- When and how can we gain knowledge by using digital technologies that involve opaque algorithms?
- Is there a risk that we are building systemic epistemic injustice into our digital technology?
- Can AI commit epistemic injustices?
- Does the digital age call for new belief forming methods or new ways of assessing belief forming methods, e.g., new methods for assessing the epistemic quality of search results?
- How do biases, luck, and opacity in digital technology affect understanding in contrast to knowledge?
- How should we use big data in our scientific theorizing? How should we address the risk of overfitting our data?
- Which epistemic virtues are particularly important in the digital age?
- Do we need stricter accountability mechanisms for internet speech? Or would that undermine key epistemic advantages of internet speech, e.g., the possibility for members of marginalized groups to speak anonymously?
- Is believing automated online sources, such as Google, a form of accepting testimony? If so, is this testimonial knowledge different from other forms of testimonial knowledge?
- What are the epistemic risks (or potential benefits) of personalized search engines, news feeds, software suggestions, etc.?
- What are our epistemic obligations when the evidence we gather is heavily influenced by our past preferences for certain information?
Long abstracts (800 words max) should be prepared for a double-blind review process and submitted in PDF format via EasyChair. In addition, a short abstract (100 words max) should be copied and pasted into the EasyChair form.
Deadline: August 1st, 2019
Notification of acceptance: September 1st, 2019
If you have any further queries, please contact:
Aude Bandini: aude.bandini[at]umontreal.ca