The Reset Our Future Fund is Reset's primary way to support organisations and individuals using technology, research, and education to address the threat surveillance capitalism poses to democracy, human rights, and open societies.
Through this Open Call, Reset strives to uphold and increase society's capacity to understand and address the harms of surveillance capitalism. In doing so, the Reset Our Future Fund supports technology-focused interventions with clear human-centred benefits that advance the public’s ability as individuals, citizens, consumers, and users to mitigate and defend against those harms. Through the efforts supported, we seek to increase awareness and reaffirm the internet’s ability to promote people over profits. We ultimately believe better, more virtuous technology-centred interventions are the best way to help reduce current harms and ensure the existence of healthier, more democratic societies in the future.
Learn more about how we see the world and what we seek to accomplish here.
Project themes & ideas
Applicants are invited to imagine a specific project or set of activities, built around one or more of the commonly supported activities listed below (or a related activity), that will increase society's ability to understand, mitigate, and defend against the harms of surveillance capitalism.
Note: The following list is not comprehensive and will necessarily evolve over time. The activities and ideas provided below are intended to be illustrative and indicative of whether applicants are in the right place. Interested parties should feel free to apply even if their intended activity/project does not appear below. Remember, the Reset Our Future Fund focuses on supporting technology-focused interventions with clear human-centred benefits.
Commonly supported activities
Reset Our Future Fund projects often involve software development, technical analysis and research, and/or the publication of reports, websites, and other awareness-raising activities.
Software development
- Creating new openly licensed technologies that fill a current need for affected people.
- Improving the security, usability, feature-set, and adoptability of existing technologies.
- Misinformation protection tools and techniques.
- Privacy-enhancing technologies defending against data extraction.
- Content authentication and validation techniques.
- Apps or platforms to illuminate what data is being collected/extracted from users.
- Ongoing support of other crucial technology/tools.
- Creating or sustaining alternative infrastructure providers, or other underlying technology structures, in markets currently dominated by offending companies.
- Apps or services that allow for data portability.
- Apps that support alternative business models for affected creators.
- Developing alternative content redistribution or support methods that do not perpetuate the status quo.
Technical analysis and research
- Providing new or deeper insights into the challenges of affected communities that ultimately contribute to the improvement of technological solutions.
- Emphasising applied research.
- Conducting research focused on real-time monitoring and analysis of both the technical and political threats of surveillance capitalism.
- Reverse-engineering/black-box audits of algorithms and prediction markets.
- Tracking and attribution of dark money in and out of critical democratic processes.
- Forensics, detection, attribution, and effect measurement of misinformation campaigns.
- Automated tracking of trends in relevant government and corporate laws, policies, and directives.
- Engagement at international standards bodies to participate in the creation and modification of relevant specifications.
- Working with lawyers and/or civil society members to provide evidence for strategic litigation and/or campaigns.
Publication of reports, websites, or other awareness-raising activities
- Facilitating the ability of targeted communities to increase their resilience to the threats and harms of surveillance capitalism.
- How-to guides or instructional apps.
- Early education of young people.
- Incorporating collaborative partnerships with other organisations and/or individuals within the field of digital rights, related fields and communities, or their respective area of focus.
We welcome applications that address a host of different topics or themes, so long as they help advance our goals and operate within our four core areas of work. Sample project ideas for Reset Our Future Fund projects include (but are not limited to):
Crackdown on fake engagements: What could be done to encourage platforms to address posts, pages, and videos that promote the purchase of fake engagements? Could ads be purchased to pop up when users search for “fake followers” or “how to buy fake followers, likes, and views” and notify them that this behaviour is a violation of terms of service and may be illegal in many countries? Could a bot itself reply or engage with users interacting with content promoting fake engagements to let them know this is bad behaviour?
Authentication of media via provenance: How can we trust that what we see, hear, and read is genuine, given the advances in AI’s ability to manipulate or generate visual, audio, and written content with a high potential to deceive? The proliferation of these capabilities could erode trust in media and, with it, democracies around the world by enabling the widespread distribution of false information to billions of individuals via social media platforms. What frontline tradecraft and newsroom technologies are needed for both industry and independent media to ensure the authentication of media from source to the public?
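One common building block for provenance is hashing media at the source and letting anyone downstream verify that the bytes they received are unchanged. The following is a minimal sketch of that idea only, not of any particular standard or product; the function names and the manifest format are invented for illustration.

```python
import hashlib


def digest(data: bytes) -> str:
    """SHA-256 digest of raw media bytes, computed at capture time."""
    return hashlib.sha256(data).hexdigest()


def verify(data: bytes, manifest: dict, media_id: str) -> bool:
    """Check downloaded media against the digest the source published.

    A mismatch means the bytes were altered somewhere between the
    source and the reader (or the manifest has no entry for this item).
    """
    expected = manifest.get(media_id)
    return expected is not None and digest(data) == expected


# Hypothetical flow: the newsroom publishes a manifest, the reader verifies.
original = b"raw photo bytes straight from the camera"
manifest = {"photo-001": digest(original)}

print(verify(original, manifest, "photo-001"))       # True: untouched copy
print(verify(b"doctored bytes", manifest, "photo-001"))  # False: edit detected
```

A real provenance system would sign the manifest (so readers can also trust who published it), but the hash comparison above is the step that detects tampering.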
Junk site classifier competition: What novel or underutilised approaches could be employed to produce better classifiers that can spot junk sites more quickly? Could a competition engage multiple efforts to increase our ability to identify these sites, ultimately resulting in their being de-platformed or cut off from advertising revenue? Could a bot be used to notify users who shared their content? Would it be possible for such a solution to be adopted by the platforms themselves?
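As a toy illustration of the scoring interface such a competition might evaluate entries against, here is a hedged sketch using hand-picked surface signals. A real entry would learn its features from labelled data; every phrase and weight below is invented.

```python
# Hypothetical surface signals often associated with low-quality "junk" sites.
# A real classifier would learn weights from labelled examples; these are made up.
SIGNALS = {
    "you won't believe": 2.0,
    "doctors hate": 2.0,
    "miracle cure": 3.0,
    "shocking": 1.0,
}


def junk_score(page_text: str) -> float:
    """Sum the weights of every junk signal found in the page text."""
    text = page_text.lower()
    return sum(w for phrase, w in SIGNALS.items() if phrase in text)


def is_junk(page_text: str, threshold: float = 2.5) -> bool:
    """Flag a page when its accumulated signal weight crosses a threshold."""
    return junk_score(page_text) >= threshold


print(is_junk("SHOCKING miracle cure doctors hate!"))   # True: several signals
print(is_junk("City council approves new bus route."))  # False: no signals
```

The appeal of a shared interface like `junk_score` is that competing approaches (keyword heuristics, learned models, network-level features) can be benchmarked on the same labelled corpus.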
Reverse engineering recommendation engines: Why does YouTube’s recommendation engine continue to lead people down bizarre, sometimes dangerous, and unwanted pathways? What evidence-driven approaches could be applied to learn more about the role personalisation (our personal information) plays for individual and aggregate user experiences? Starting with any given “seed video” (and proceeding forward for 100 videos), could a browser extension or mobile app safely capture and report the “next up” videos and provide basic annotation about each of the next 99 videos? What actions should YouTube and other similar platforms provide users to avoid unwanted pathways?
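The capture loop this paragraph describes, starting at a seed and recording each successive "next up" recommendation, reduces to a few lines. This sketch stubs out the platform with a lookup table: the `next_up` callback stands in for whatever a browser extension or mobile app would actually scrape, and all names here are hypothetical.

```python
from typing import Callable


def follow_chain(seed: str, next_up: Callable[[str], str], steps: int) -> list:
    """Record the path a recommendation engine takes from a seed video.

    Starts at `seed` and asks `next_up` for the top recommendation
    `steps` times, returning the full ordered chain (seed included).
    """
    chain = [seed]
    for _ in range(steps):
        chain.append(next_up(chain[-1]))
    return chain


# Stub standing in for the platform's real recommendation engine.
fake_recs = {
    "cooking-101": "knife-skills",
    "knife-skills": "survivalism",
    "survivalism": "conspiracy-doc",
}

chain = follow_chain("cooking-101", lambda v: fake_recs.get(v, "unknown"), 3)
print(chain)  # ['cooking-101', 'knife-skills', 'survivalism', 'conspiracy-doc']
```

Running the same loop under different personalisation states (fresh profile vs. an established one) is what would let a study separate the engine's baseline behaviour from the role personal data plays.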
Further sample project ideas include:
- Misinformation evidence collection
- Online advertisement transparency
- Data trust(s) for the digital evidence of harm
Things to avoid
- Testing and/or collection of end-user data that violate established ethical principles.
- Projects should exhibit originality, substance, precision, and relevance to our Goals, objectives, & areas of work. Specific project objectives should be ambitious, yet measurable and achievable (with anticipated activities and milestones projected by month). Overall project goals should extend beyond traditional audiences.
- All individuals must acquire the appropriate work authorisation. Applicants will need to secure their own visa and work permit (if applicable). For instance, a student in the United States on an F-1 visa (with a Form I-20) who intends to carry out their project there will need to apply for Curricular Practical Training authorisation for their fellowship. We are happy to provide visa letters upon request.
- Prior to submitting an application, applicants should review all relevant resources, including Open Call information, Our Guide to Open Calls, and the applicable Data use policy for applicants.