If we want to make smart decisions about the opportunities of digital civil society, we need some guiding values and principles. We know from the pace layering diagram that the outermost layers—technology and business models—will churn rapidly. For guidance, we need to go a level or two deeper—to governance and infrastructure.
The definition of digital civil society gives us a starting point. It contains three key elements—voluntary action, private resources, and public benefit—and each carries a set of values that needs to be applied to digital resources:
● Voluntary action requires that individuals participate by choice and that they opt in. They also need to be able to opt out easily when they so choose. True voluntary action in digital spaces will require consent processes that recognize the decision-making authority, choice, and intent of individuals, not the preferences or business motives of the organization.
● Private resources require that we see the individual as the “owner” of the resource. He or she must be in charge of providing the information, be responsible for its content, and have input and recourse over how it is used. We also need to make sure we don’t harm the individual by collecting his or her data. In today’s online environment, the less data collected, the safer the individual. As civil society organizations collect data from people, a good rule of thumb is to gather as little data as is viably possible. The vulnerability of online data suggests that we “don’t collect what we can’t protect.”
It’s worth noting that this runs counter to the rhetoric and practices of most businesses and some governments. It should not be surprising that civil society’s core values would stand apart. It’s time to bring our practices into line with our values, not those of the software vendors or infrastructure providers.
● Public benefit refers to the intended purpose or outcome of the action. What good can we create, using the contributed resources, either that we can't create alone or that the broader public isn’t committed to making happen? To be public, these benefits need to accrue beyond any one individual who commits his or her private resources. As such, we should be committed to sharing what we’ve learned and inviting others to build on our work.
These basic premises give us a starting point for shaping the safe, ethical, and effective use of digital resources for good. Consent matters. Clear rules for how something is owned and shared need to be developed. Protecting the privacy of individuals is important. And broad benefit should be the goal. Translated into digital parlance, these values suggest the practices to be prioritized, created, and improved upon:
● Voluntary = consent practices
● Private = ownership, security, due process, and recourse
● Public = open and reusable
These ideas offer three starting principles for using digital data ethically, safely, and effectively.
● First, consent. Voluntary participation means that informed, active consent is a prerequisite, as are practices that make it easy to withdraw participation (and retract or destroy data). Consent alone is not sufficient, because of the derivative and persistent nature of digital data, and because many of us don’t really have choices in what services we can use, but it is a starting point.
● Second, privacy. Protection of the private individual—and respect for her autonomy at all times—requires the sector to place a high value on her privacy. Given the (poor) state of digital security, the prudent option is to collect as little data as possible and to be creative (and privacy-minded) about what is collected. Take an approach of minimum viable data collection. Nonprofits can’t adequately protect people’s private data, and they rarely have the capacity to “go make sense of it later.” For nonprofits, the marketing-driven zeitgeist that more data is better is rarely going to be true. This is where it helps to see data as a liability.
● Finally, default to openness. The pursuit of public benefit leads to the third principle, which is a default to openness. This is only possible if in fact the first two principles—consent and minimum viable data collection—have been enacted. Only then is it appropriate for data to be shared in ways that can advance the change we seek. Similarly, knowing that you expect to open up the final product calls for the development of robust consent and privacy practices at the beginning.
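The three principles above imply a concrete ordering that can be made visible in code: opt-in consent gates collection, an allowlist enforces minimum viable data, withdrawal destroys (not merely flags) the record, and openness applies only to consented, de-identified data. The following is an illustrative sketch only—the `Registry` class, `ALLOWED_FIELDS` list, and all field names are hypothetical, not a prescribed implementation:

```python
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timezone

# Minimum viable data: fields with a stated purpose. Everything
# else is dropped at intake, never stored. (Hypothetical allowlist.)
ALLOWED_FIELDS = {"name", "email"}


@dataclass
class Participant:
    record_id: str
    data: dict
    consented_at: datetime | None = None  # None = no active consent

    @property
    def has_consent(self) -> bool:
        return self.consented_at is not None


class Registry:
    def __init__(self) -> None:
        self._records: dict[str, Participant] = {}

    def enroll(self, record_id: str, raw: dict, opted_in: bool) -> Participant | None:
        # Principle 1 (consent): no opt-in, no record at all.
        if not opted_in:
            return None
        # Principle 2 (privacy): keep only allowlisted fields.
        minimal = {k: v for k, v in raw.items() if k in ALLOWED_FIELDS}
        p = Participant(record_id, minimal, consented_at=datetime.now(timezone.utc))
        self._records[record_id] = p
        return p

    def withdraw(self, record_id: str) -> None:
        # Easy exit: withdrawal destroys the data, not just a flag.
        self._records.pop(record_id, None)

    def open_dataset(self) -> list[dict]:
        # Principle 3 (openness): share only consented records,
        # and never the direct identifiers.
        return [
            {k: v for k, v in p.data.items() if k != "email"}
            for p in self._records.values()
            if p.has_consent
        ]
```

The design choice worth noting is the ordering: `open_dataset` can only ever expose what survived the consent gate and the allowlist, which is exactly the dependency the text describes—openness is safe only because the first two principles were enacted upstream.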
It’s important to see the alignment across these principles. Data that are voluntarily contributed, well protected, and stored with close attention to the individuals’ privacy are positioned to be shared. Robust consent and privacy practices are (or should be) prerequisites for openness.
Takeaways are critical, bite-sized resources either excerpted from our guides or written by GrantCraft using the guide's research data or themes post-publication. Attribution is given if the takeaway is a quotation.
This takeaway was derived from Philanthropy and the Social Economy: Blueprint 2016.