Summary
The spread of disinformation is a serious problem that damages social structures and threatens democracies worldwide. Citizens increasingly rely on (dis)information available online, either somewhat passively, through social media feeds, or actively, by using search engines and specific websites. In both scenarios, algorithms filter and select the displayed information according to users’ past preferences. There is a real risk that algorithms reinforce users’ beliefs and create (dis)information bubbles, by offering less divergent views or even directing users to low-credibility content. For these reasons, serious efforts have been made to identify and remove “fake-news” websites and to minimize the spread of disinformation on social media, but we have not witnessed equivalent attempts to understand and curtail the role of search engines. FARE_AUDIT addresses this imbalance and offers an innovative, broadly usable tool to audit search engines. It will help to 1) better understand how browsing history influences search engine results, particularly the likelihood of being directed to disinformation, 2) create a system that democracy-promoting institutions and concerned citizens can use to identify new disinformation in near real-time, and 3) breach information bubbles by simulating how search results would differ if users had a different online profile. Because it relies on web-crawlers, our tool is privacy-protecting and does not require any real user data. Moreover, the proposed system anticipates the announced shift from cookie-tracking to fingerprinting and takes advantage of the expected small time overlap between both systems to learn from each and broaden its scope. Overall, we expect this novel tool to have a meaningful social impact by increasing public awareness of the role of search engines in disinformation spread, and by equipping organizations with a tool to detect and monitor disinformation, especially in political contexts.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101100653
Start date: 01-12-2022
End date: 31-05-2024
Total budget - Public funding: 150 000,00 Euro
Cordis data
Status: SIGNED
Call topic: ERC-2022-POC2
Update date: 09-02-2023