Reset Our Future Fund

Deadline:
1st November 2020

Surveillance capitalism harms democracy, human rights online, and open societies. This open call is our primary way to support organisations and people using technology, research, and education that help tackle these challenges.

Profile

Through this open call, Reset strives to uphold and increase the capacity of individuals, organisations, and companies to address the harms of surveillance capitalism on human rights, democracy, and open societies. The call will support technology-focused interventions with clear human-centred benefits that aim to advance the public’s understanding and awareness of surveillance capitalism and its harms, and to strengthen society’s ability, as individuals, citizens, consumers, and users, to mitigate and defend against those harms. Through the efforts supported, we aim to reaffirm that the internet should prioritise healthy, thriving global democracies over profits. We believe that better, more virtuous technology-centred interventions can reduce existing harms and help ensure healthier, more democratic societies.

Project themes & ideas

The following lists are not comprehensive, will evolve, and are merely some ideas to help applicants know if they are in the right place. Feel free to apply even if you do not see your proposed activity, and especially if you see something similar to your own work.

Commonly supported activities

Software development

  • Creating new openly licensed technologies that fill a current need for affected people;
  • Improving the security, usability, feature-set and adoptability of existing technologies;
  • Misinformation protection tools and techniques;
  • Privacy-enhancing technologies defending against data extraction;
  • Content authentication and validation techniques;
  • Apps or platforms to illuminate what data is being collected/extracted from people;
  • Ongoing support of other crucial technology/tools;
  • Creation or sustaining of alternate infrastructure providers or other underlying technology structures currently dominated by offending companies;
  • Apps or services that allow for data portability;
  • Apps that support alternative business models for affected creators; and
  • Developing alternate content redistribution or support methods that do not perpetuate the status quo.

Technical analysis and research

  • Providing new or deeper insights into the challenges of affected communities that ultimately contribute to the improvement of technological solutions;
  • Emphasising applied research;
  • Conducting research focused on real-time monitoring and analysis of both the technical and political threats from surveillance capitalism;
  • Reverse-engineering/black-box audits of algorithms and prediction markets;
  • Tracking and attribution of dark money in and out of critical democratic processes;
  • Forensics, detection, attribution, and effect measurement of misinformation campaigns;
  • Automated tracking of trends in relevant government and corporate laws, policies, and directives;
  • Engagement at international standards bodies to participate in the creation and modification of relevant specifications; and
  • Working with lawyers and/or civil society members to provide evidence for strategic litigation and/or campaigns.

Publication of reports, websites, or other awareness-raising activities

  • Facilitating the ability for targeted communities to increase their resiliency to threats and harms of surveillance capitalism;
  • How-to guides or instructional apps;
  • Early education of young people; and
  • Incorporating collaborative partnerships with other organisations and/or individuals within the field of digital rights, related fields and communities, or their respective area of focus.

Ideas

Crackdown on fake engagements

What could be done to encourage platforms to address posts, pages, videos, and other content that promote the purchase of fake engagements? Could ads be purchased for users searching for “fake followers” or “how to buy fake followers, likes, and views” that pop up to remind them that this behaviour violates terms of service and may be illegal in many countries? Could a bot reply to or engage with users interacting with content promoting fake engagements, to let them know this is bad behaviour?

Authentication of media via provenance

With advances in AI’s ability to manipulate or generate visual, audio, and written content with a high potential to deceive, how do we trust that what we see, hear, and read is genuine? The proliferation of these capabilities could erode trust in media, and with it democracies around the world, by enabling widespread distribution of false information to billions of individuals via social media platforms. What front-line tradecraft and newsroom technologies are needed for both industry and independent media to ensure the authentication of media from source to the public?
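One building block for source-to-public authentication is attaching a verifiable provenance record to media at the point of capture or publication. The sketch below is a minimal Python illustration, using a keyed hash (HMAC) as a stand-in for the public-key signatures a real provenance standard such as C2PA would use; the key, author name, and record format here are all hypothetical.

```python
import hashlib
import hmac

# Stand-in for a real signing key; production systems would use
# public-key certificates rather than a shared secret.
SECRET = b"demo-key"

def sign_media(media_bytes, author, key=SECRET):
    """Attach a provenance record: a content hash plus a keyed signature."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    tag = hmac.new(key, (digest + author).encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "author": author, "sig": tag}

def verify_media(media_bytes, record, key=SECRET):
    """Re-hash the content and check both the hash and the signature."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    if digest != record["sha256"]:
        return False  # content was altered after signing
    expected = hmac.new(key, (digest + record["author"]).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])

photo = b"\x89PNG...raw image bytes..."
rec = sign_media(photo, "Example Newsroom")
print(verify_media(photo, rec))         # True: untouched content verifies
print(verify_media(photo + b"x", rec))  # False: any tampering is detected
```

The design choice worth noting is the detached record: because the signature travels alongside the media rather than inside it, any newsroom or reader can re-verify the content without special tooling.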

Junk site classifier competition

What novel or underutilised approaches could be employed to produce better classifiers that spot junk sites faster? Could a competition engage multiple efforts to increase our ability to identify these sites, resulting in their de-platforming or cutting them off from advertising revenue? Could a bot also be used to notify users who shared their content? Finally, could this solution be adopted by the platforms themselves?
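As a rough illustration of the kind of baseline a competition entry would need to beat, the sketch below trains a tiny naive Bayes text classifier in plain Python. The labelled examples and the junk/legitimate split are invented for illustration; a real effort would use a large curated corpus and far richer features than bag-of-words.

```python
import math
from collections import Counter

# Toy labelled examples standing in for a real training corpus (hypothetical).
TRAIN = [
    ("shocking miracle cure doctors hate click now", 1),  # 1 = junk
    ("you wont believe this one weird trick", 1),
    ("city council approves new transit budget", 0),      # 0 = legitimate
    ("quarterly report shows steady economic growth", 0),
]

def train(examples):
    """Count word occurrences per class."""
    counts = {0: Counter(), 1: Counter()}
    totals = {0: 0, 1: 0}
    for text, label in examples:
        for word in text.split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def junk_score(text, counts, totals):
    """Log-odds that the text is junk, with add-one smoothing."""
    vocab = set(counts[0]) | set(counts[1])
    score = 0.0
    for word in text.split():
        p_junk = (counts[1][word] + 1) / (totals[1] + len(vocab))
        p_legit = (counts[0][word] + 1) / (totals[0] + len(vocab))
        score += math.log(p_junk / p_legit)
    return score

counts, totals = train(TRAIN)
print(junk_score("one weird trick doctors hate", counts, totals) > 0)  # True
print(junk_score("council budget report", counts, totals) > 0)         # False
```

A positive score means the text looks more like the junk class; thresholding that score is where a competition could compare entries on speed and precision.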

Reverse engineering recommendation engines

Why does YouTube’s recommendation engine continue to lead people down bizarre, sometimes dangerous, and unwanted pathways? What evidence-driven approaches could be applied to learn more about the role personalisation (our personal information) plays in individual and aggregate user experiences? Could a browser extension or mobile app safely capture and report individuals’ “next up” videos, starting from any given “seed video” and following it 100 videos deep, with basic annotation of each of the next 99 videos? What actions should YouTube and similar platforms offer users to avoid unwanted pathways?
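The data-collection idea above can be sketched as a simple chain walk. In the sketch below, `get_next_up` is a hypothetical placeholder for whatever a browser extension or instrumented client would actually report as the “next up” slot; it fabricates deterministic successors purely so the walk logic can run end to end.

```python
def get_next_up(video_id):
    """Placeholder: in a real study this would come from a browser
    extension or instrumented client reporting the platform's 'next up'
    recommendation. Here it fabricates a successor ID for illustration."""
    return f"video_{hash(video_id) % 10_000}"

def walk_next_up_chain(seed_video, depth=100):
    """Follow the 'next up' recommendation from a seed video, recording
    each hop so the resulting pathway can be annotated and compared."""
    chain = [seed_video]
    for _ in range(depth - 1):
        nxt = get_next_up(chain[-1])
        if nxt in chain:  # stop if the engine loops back on itself
            break
        chain.append(nxt)
    return chain

path = walk_next_up_chain("seed_abc", depth=100)
print(path[0])  # the seed video starts the chain
```

Running many such walks from diverse seeds, with and without personalisation, is what would let researchers separate the engine’s baseline behaviour from what our personal information contributes.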

Misinformation evidence collection

Online advertisement transparency

Data trust(s) for the digital evidence of harm

Things to avoid

  • Testing and/or collection of end-user data that violates established ethical principles.

Important considerations

  • Projects should exhibit originality, substance, precision, and relevance to our aims. Objectives should be ambitious yet measurable and achievable, with activities and milestones listed monthly. The overall project goals should reach beyond traditional audiences.
  • For the duration of fellowships, the fellow will be expected to work with Reset.
  • Any individual will need to acquire the appropriate work authorisation. For instance, a student on an F-1 visa (with a Form I-20) intending to carry out their project in the United States will need to apply to use Curricular Practical Training for their fellowship.
  • Please be sure to review all of the general open call information, Our Guide to Open Calls, and the data policy for applicants.
