Google’s Sundar Pichai: Privacy Should Not Be a Luxury Good
Yes, we use data to make products more helpful for everyone. But we also protect your information.
By Sundar Pichai
Mr. Pichai is the chief executive of Google.
May 7, 2019
Credit: Illustration by Yoshi Sodeoka. Photograph by Max Whitaker.
MOUNTAIN VIEW, Calif. — Google products are designed to be helpful. They take the friction out of daily life (for example, by showing you the fastest route home at the end of a long day) and give you back time to spend on things you actually want to do. We feel privileged that billions of people trust products like Search, Chrome, Maps and Android to help them every day.
It’s a trust we match with a profound commitment to responsibility and a healthy dose of humility. Many words have been written about privacy over the past year, including in these pages. I believe it’s one of the most important topics of our time.
People today are rightly concerned about how their information is used and shared, yet they all define privacy in their own ways. I’ve seen this firsthand as I talk to people in different parts of the world. To the families using the internet through a shared device, privacy might mean privacy from one another. To the small-business owner who wants to start accepting credit card payments, privacy means keeping customer data secure. To the teenager sharing selfies, privacy could mean the ability to delete that data in the future.
Privacy is personal, which makes it even more vital for companies to give people clear, individual choices around how their data is used. Over the past 20 years, billions of people have trusted Google with questions they wouldn’t have asked their closest friends: How do you know if you’re in love? Why isn’t my baby sleeping? What is this weird rash on my arm? We’ve worked hard to continually earn that trust by providing accurate answers and keeping your questions private. We’ve stayed focused on the products and features that make privacy a reality — for everyone.
“For everyone” is a core philosophy for Google; it’s built into our mission to create products that are universally accessible and useful. That’s why Search works the same for everyone, whether you’re a professor at Harvard or a student in rural Indonesia. And it’s why we care just as much about the experience on low-cost phones in countries starting to come online as we do about the experience on high-end phones.
Our mission compels us to take the same approach to privacy. For us, that means privacy cannot be a luxury good offered only to people who can afford to buy premium products and services. Privacy must be equally available to everyone in the world.
Even in cases where we offer a paid product like YouTube Premium, which includes an ad-free experience, the regular version of YouTube has plenty of privacy controls built in. For example, we recently brought Incognito mode, the popular feature in Chrome that lets you browse the web without linking any activity to you, to YouTube. You can view YouTube as a logged-in user or in Incognito mode.
To make privacy real, we give you clear, meaningful choices around your data. All while staying true to two unequivocal policies: that Google will never sell any personal information to third parties; and that you get to decide how your information is used. Here’s how it works:
First, data makes the products and services you use more helpful to you. It’s what enables the Google Assistant to book a rental car for your trip, Maps to tell you how to navigate home and Photos to share vacation pictures with a click of a button.
Second, products use anonymous data in aggregate to be more helpful to everyone. Traffic data in Google Maps reduces gridlock by offering people alternate routes. Queries in Google Translate make translations more accurate for billions of people. Anonymized searches over time help Search understand your questions, even if you misspell them.
Third, a small subset of data helps serve ads that are relevant and that provide the revenue that keeps Google products free and accessible. That revenue also sustains a broad community of content creators, which in turn helps keep content on the web free for everyone. The data used in ads could be based on, for example, something you searched for or an online store you browsed in the past. It does not include the personal data in apps such as Docs or Gmail. Still, if receiving a customized ads experience isn’t helpful, you can turn it off. The choice is yours and we try to make it simple.
Eight years ago, we introduced an easy way to export all your data from Google services — and even take it elsewhere. A few years later, we created the Google Account page as a place to review and adjust all of your privacy controls. Nearly 20 million people now visit it every day. But we know our work here is never done, and we want to do more to stay ahead of user expectations.
Last week, we announced significant new privacy features, including one-click access to privacy settings from all our major products and auto-delete controls that allow you to choose how long you want data to be saved. And to protect your data from security threats, we just introduced a security key built into Android phones that can provide two-factor authentication.
We’re also working hard to challenge the assumption that products need more data to be more helpful. Data minimization is an important privacy principle for us, and we’re encouraged by “federated learning,” a technique developed by Google A.I. researchers. It allows Google’s products to work better for everyone without collecting raw data from your device. Federated learning is how Google’s Keyboard can recognize and suggest new words like “YOLO” and “BTS” after thousands of people begin typing them — without Google ever seeing anything you type. In the future, A.I. will provide even more ways to make products more helpful with less data.
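To make the idea concrete, here is a minimal sketch in Python of federated averaging, the core technique behind federated learning. It is purely illustrative and not Google’s implementation: each simulated “device” trains on its own private data and sends back only model weights, which a server averages; the raw data never leaves the device.

```python
# Illustrative sketch of federated averaging (not Google's implementation).
# Each simulated device trains on its own data and shares only model
# weights; the raw local data never leaves the device.

import numpy as np

def local_update(global_weights, local_data, local_labels, lr=0.1, epochs=5):
    """Run a few steps of gradient descent on one device's private data."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = local_data @ w                       # simple linear model
        grad = local_data.T @ (preds - local_labels) / len(local_labels)
        w -= lr * grad
    return w                                         # only weights are shared

def federated_average(client_weights, client_sizes):
    """Server step: average client weights, weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Simulated round: three devices, each with private data the server never sees.
rng = np.random.default_rng(0)
global_w = np.zeros(4)
clients = [(rng.normal(size=(20, 4)), rng.normal(size=20)) for _ in range(3)]

for _ in range(10):                                  # ten federated rounds
    updates = [local_update(global_w, x, y) for x, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])
```

In this sketch only the weight vectors travel between the devices and the server, which is the property the paragraph above describes: the shared model improves for everyone while each device’s data stays on that device.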
Even as we make privacy and security advances in our own products, we know the kind of privacy we all want as individuals relies on the collaboration and support of many institutions, like legislative bodies and consumer organizations.
Europe raised the bar for privacy laws around the world when it enacted the General Data Protection Regulation. We think the United States would benefit from adopting its own comprehensive privacy legislation and have urged Congress to pass a federal law. Ideally, privacy legislation would require all businesses to accept responsibility for the impact of their data processing in a way that creates consistent and universal protections for individuals and society as a whole.
Legislation will help us work toward ensuring that privacy protections are available to more people around the world. But we’re not waiting for it. We have a responsibility to lead. And we’ll do so in the same spirit we always have, by offering products that make privacy a reality for everyone.
Sundar Pichai is the chief executive of Google.