All of us want to create inclusive, safe, and privacy-aware digital experiences, but where to begin? Our brand new Smashing Book, “Understanding Privacy,” written by Heather Burns, can help lay the ground for future developers, designers, and project managers to build a better web for tomorrow. Jump to the details or get the book right away.
Q. Did any part of your education require you to read the building regulations?
A. No, I don’t recall, but I don’t think so.
Q. What about fire safety of building materials?
A. No.
— Witness testimony, the Grenfell Tower inquiry, day 22
As the hard copies of Understanding Privacy begin to ship all over the world, the brilliant team at Smashing, who helped me bring my book to life, have asked me to share a few thoughts on what it might achieve in the months and years to come.
I wrote the book for two broad audiences, and if you’re reading this, you probably fall into one of them. The first audience is designers, developers, and project managers already working on the open web, either professionally or in side projects. The second audience is students and future professionals in those fields, whether they are in secondary schools, undergraduate courses, vocational training, or code academies.
The reason I structured the book in the way I did is that these two audiences, and you, tend to have something in common: you’ve never received any previous training or education on positive foundational privacy — not as a concept, a legal issue, or a professional practice, and not in your formal education (assuming you had any) or in your workplaces.
That means that an entire generation of professionals like you have been introduced to privacy by being thrown into advanced legal compliance headaches or reactive fixes to the problems created by others, with no knowledge of the basic concepts and principles about what privacy actually is and how to achieve it.
I hope that Understanding Privacy can go some way towards giving people like you a confident understanding of those foundational concepts. Indeed, it’s my hope that teachers and educators will use it as the basis for a curriculum on privacy so that a healthy approach to privacy becomes baked-in from the start rather than retrofitted at the end.
That, of course, raises two further issues.
- The first is how teachers and educators find the book in the first place;
- The second is how the lessons in it reach developers (both current and future ones) who have never had a teacher and never will.
The question of how we teach privacy, what we teach, and where we teach it has troubled me for years. It’s a hard enough problem to crack. After all, web development is an unorganized field. In fact, in the strictly legal sense, it is not a profession.
What do I mean by that? Professions are defined by industry-based organisations, common paths of entry, common educational requirements, continuing professional development, and even certifications that require refresher training every few years. Being a professional, in the strictly legal sense, means that there is a body made up of your peers who make sure that you bring a common body of knowledge to the work you put into the world and that the work you do meets externally verifiable standards.
Web development, on the other hand, is an unorganized and unstructured field that anyone can enter, at any time, with any form of formal training or with none at all, and without any external certifications or approval. A software engineer who went through a four-year bachelor’s degree program in computer science can be working on the same team, doing the same work, as a former airline pilot who learned to code for fun. It’s that occupational diversity that has contributed to the growth of the open web as a whole; indeed, I find that the best teams contain people who approach their work from the diverging perspectives they gained doing something completely different.
But it also means that the knowledge of privacy that we bring to our work is, quite simply, all over the place — if that knowledge is there at all. And without a common pathway of education, whether that’s access to training and continuing professional development or work towards a standard of foundational knowledge, we will continue to bring the contrasting social, legal, and cultural approaches to privacy that I discuss in Part One to the work we put into the world.
And our users will continue to pay the price for that.
As I write in the book, we can’t wait for educators, employers, and institutions to fill the gaps in our knowledge. Educating ourselves on privacy has never been more important. It pains me to know that the book is the first and only education on privacy that most of its readers will ever have, but something is better than nothing. Unfortunately, it won’t be enough.
Because the dilemma of how we teach a positive foundational approach to privacy — in an unorganized and open industry — has taken on a whole new urgency through my career pivot into the politics of tech. And that situation is far scarier than you can imagine.
Much of my time in recent years has been spent contesting various regulatory plans for personal liability regimes in digital regulation. That means that politicians increasingly want to hold the people who make the open web legally and even criminally responsible for any misuse or unintended consequences of their work.
Some of this is born out of pressure to “do something” about the mess that the open web is today; sometimes it’s about “reining in the tech giants” (and I can tell you that politicians absolutely think Facebook is the Internet); sometimes it’s about barefaced moves for political power (hello from Brexit Britain); and sometimes it’s about cracking down on public discourse and interaction, delegating the requirement for censorship and control to the tech sector and therefore to workers like you.
Whatever reasons are behind these proposals, they are not going away. In fact, they’re only getting louder.
Many of these proposed liability regulations have been borrowed from traditional health and safety regimes. But these draft regulations, and those who support them, fail to understand that human discourse cannot be regulated as if it were fire-retardant cladding on a building that wasn’t fire-retardant at all (as I noted in the quote which began this article). By trying to shoehorn human interactions into a “risk assessment” model, these regimes risk creating an unworkable legal standard where a person who misuses a service is not deemed liable, but the person who built the service is.
These proposed liability regimes, for what it’s worth, have been drafted in a highly obsessive and vindictive manner to target a handful of high-profile American billionaires and celebrities in a handful of American big tech companies. (To be precise, these proposed regimes target three specific individuals in two companies, as if their arrests and imprisonment would fix all the problems on the Internet.) For the purposes of this discussion, those men’s guilt or complicity is neither here nor there.
That’s because, for a range of obvious reasons, once those laws are on the books, the celebrity billionaires will be able to afford to duck and dodge the charges. But because politicians insist that “something has to be done” and “someone needs to pay for this”, those laws will be used, instead, to go after the little guys and the easy targets. That means you.
What I am saying is that policymakers across all societies and cultures are turning their attention to people like you, the knowledge you bring to the table, and the work you put into the world. They’re not doing that because they want to support you into the next phase of your career. They’re doing that because they’re looking for someone to blame. They need someone to blame.
In fact, I have encountered politicians who are desperate to actually arrest, prosecute, and imprison developers, hopefully in front of the TV cameras, as punishment for the sins of their celebrity bosses. Those policymakers are in ascendancy, and they are not going away.
And when they’re looking for someone to blame for the problems on the open web today and need an easy target to take down for a quick political “win,” there you are, with no qualifications or foundational training or formal education, making things that millions of people use.
I think you see where this is going.
I wrote Understanding Privacy to contribute to a better open web, and I wrote it in the most positive and constructive tone possible (and hey, that was hard going in lockdown). But I would not be serving the people I wrote it for if I pretended that the book’s teachings exist in a happy bubble where the fixes are easy. The book’s teachings exist in a political climate where the people who make the web, including you, are now a target.
I want the book to contribute to a better standard of privacy for the people we build the web for. By reading the book, you’ll learn how to protect them in everything you do, regardless of the presence or absence of any privacy legislation. But in the political climate that exists around us all, by reading the book, you’ll learn how to protect yourself too.
In the absence of any formal curricular and educational path, workplace training, professional body, or legal standard, the book will help you to create an accountable and documented framework around privacy in your work — no matter who employs you or what you’re working on. That framework, and for that matter, the book, can’t protect you on its own. But the lessons you learn from it might just help you when the day comes that it’s you and your team in your co-working lounge, and not the celebrity billionaires and their teams in Silicon Valley, who become the target for an ambitious politician’s campaigning.
During life in lockdown, we all became familiar with the “oxygen mask” rule: secure your own mask before helping someone else with theirs. In other words, you can’t support others if you’re not supporting yourself. As you use Understanding Privacy to build a better web for your users, take some time to think about the ways you can use its lessons to protect yourself, especially in light of policymakers’ obsession with getting ad-hominem revenge on Big Tech celebrities — an obsession which views you as expendable collateral damage. And as I write in Part Four, think about the developers who will come after you and what sort of world they can build if they are given a better education in foundational privacy.