The largest collection of public internet censorship data ever compiled shows that even citizens of what are considered the world’s freest countries aren’t safe from internet censorship.
A team from the University of Michigan used its own Censored Planet tool, an automated censorship tracking system launched in 2018, to collect more than 21 billion measurements over 20 months in 221 countries.
“We hope that the continued publication of Censored Planet data will enable researchers to continuously monitor the deployment of network interference technologies, track policy changes in censoring nations, and better understand the targets of interference,” said Roya Ensafi, U-M assistant professor of electrical engineering and computer science who led the development of the tool.
Poland blocked human rights sites, India same-sex dating sites
Ensafi’s team found that censorship is increasing in 103 of the countries studied, including unexpected places like Norway, Japan, Italy, India, Israel and Poland. These countries, the team notes, are rated some of the world’s freest by Freedom House, a nonprofit that advocates for democracy and human rights.
They were among nine countries where Censored Planet found significant, previously undetected censorship events between August 2018 and April 2020; the other three were Cameroon, Ecuador and Sudan.
While the United States saw a small uptick in blocking, mostly driven by individual companies or internet service providers filtering content, the study did not uncover widespread censorship. However, Ensafi points out that the groundwork for that has been put in place here.
“When the United States repealed net neutrality, they created an environment in which it would be easy, from a technical standpoint, for ISPs to interfere with or block internet traffic,” she said. “The architecture for greater censorship is already in place and we should all be concerned about heading down a slippery slope.”
It’s already happening abroad, the researchers found.
“What we see from our study is that no country is completely free,” said Ram Sundara Raman, U-M doctoral candidate in computer science and engineering and first author of the study. “We’re seeing that many countries start with legislation that compels ISPs to block something that’s obviously bad like child pornography or pirated content.
“But once that blocking infrastructure is in place, governments can block any websites they choose, and it’s a very opaque process. That’s why censorship measurement is crucial, particularly continuous measurements that show trends over time.”
Norway, for example (tied with Finland and Sweden as the world’s freest country, according to Freedom House), passed laws requiring ISPs to block some gambling and pornography content beginning in early 2018.
Censored Planet, however, uncovered that ISPs in Norway are imposing what the study calls “extremely aggressive” blocking across a broader range of content, including human rights websites like Human Rights Watch and online dating sites like Match.com.
Similar tactics show up in other countries, often in the wake of large political events, social unrest or new laws. News sites like The Washington Post and The Wall Street Journal, for example, were aggressively blocked in Japan when Osaka hosted the G20 international economic summit in June 2019.
News, human rights and government sites saw a censorship spike in Poland after protests in July 2019, and same-sex dating sites were aggressively blocked in India after the country repealed laws against gay sex in September 2018.
Censored Planet releases technical details for researchers, activists
The researchers say the findings show the effectiveness of Censored Planet’s approach, which turns public internet servers into automated sentries that can monitor and report when access to websites is being blocked.
Running continuously, it takes billions of automated measurements and then uses a series of tools and filters to analyze the data and tease out trends.
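The detection logic can be illustrated with a toy sketch; this is not Censored Planet's actual pipeline, whose remote-measurement techniques are far more involved, and every name and threshold below is an illustrative assumption. The idea: compare what a vantage point observes for a URL against a trusted baseline, and flag tell-tale anomalies such as connection resets, certificate mismatches, or injected blockpages.

```python
# Toy illustration (NOT Censored Planet's real method) of flagging likely
# interference by comparing a measured response against a trusted baseline.

from dataclasses import dataclass

@dataclass
class Response:
    status: int          # HTTP status code (0 = no response at all)
    body_len: int        # length of the returned page body in bytes
    cert_matches: bool   # TLS certificate matched the expected one

# Strings commonly seen on injected blockpages (purely illustrative)
BLOCKPAGE_MARKERS = ("access denied", "this site is blocked")

def looks_blocked(measured: Response, baseline: Response, body: str = "") -> bool:
    """Heuristics similar in spirit to automated censorship detection."""
    if measured.status == 0:                 # connection dropped or reset
        return True
    if not measured.cert_matches:            # possible TLS interception
        return True
    if any(m in body.lower() for m in BLOCKPAGE_MARKERS):
        return True
    # A drastically smaller page than baseline often indicates an injected page
    if baseline.body_len and measured.body_len < 0.2 * baseline.body_len:
        return True
    return False

baseline = Response(status=200, body_len=52_000, cert_matches=True)
print(looks_blocked(Response(0, 0, True), baseline))         # True: reset
print(looks_blocked(Response(200, 51_500, True), baseline))  # False: normal
print(looks_blocked(Response(200, 1_200, True), baseline, "Access Denied by order"))  # True
```

A real system aggregates many such signals across vantage points and over time before concluding that blocking is taking place.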
The study also makes public technical details about the workings of Censored Planet that Raman says will make it easier for other researchers to draw insights from the project’s data, and help activists make more informed decisions about where to focus.
“It’s very important for people who work on circumvention to know exactly what’s being censored on which network and what method is being used,” Ensafi said. “That’s data that Censored Planet can provide, and tech experts can use it to devise circumventions.”
Censored Planet’s constant, automated monitoring is a departure from traditional approaches that rely on volunteers to collect data manually from inside countries.
Manual monitoring can be dangerous, as volunteers may face reprisals from governments. The limited scope of manual efforts also means they are often focused on countries already known for censorship, allowing nations perceived as freer to fly under the radar.
While censorship efforts generally start small, Raman says they could have big implications in a world that is increasingly dependent on the internet for essential communication needs.
“We imagine the internet as a global medium where anyone can access any resource, and it’s supposed to make communication easier, especially across international borders,” he said. “We find that if this continues, that won’t be true anymore. We fear this could lead to a future where every country has a completely different view of the internet.”
IMSI-Catchers from Canada
Gizmodo is reporting that Harris Corp. is no longer selling Stingray IMSI-catchers (and, presumably, its follow-on models Hailstorm and Crossbow) to local governments:
L3Harris Technologies, formerly known as the Harris Corporation, notified police agencies last year that it planned to discontinue sales of its surveillance boxes at the local level, according to government records. Additionally, the company would no longer offer access to software upgrades or replacement parts, effectively slapping an expiration date on boxes currently in use. Any advancements in cellular technology, such as the rollout of 5G networks in most major U.S. cities, would render them obsolete.
The article goes on to talk about replacement surveillance systems from the Canadian company Octasic.
Octasic’s Nyxcell V800 can target most modern phones while maintaining the ability to capture older GSM devices. Florida’s state police agency described the device, made for in-vehicle use, as capable of targeting eight frequency bands including GSM (2G), CDMA2000 (3G), and LTE (4G).
A 2018 patent assigned to Octasic claims that Nyxcell forces a connection with nearby mobile devices when its signal is stronger than that of the nearest legitimate cellular tower. Once connected, Nyxcell prompts devices to divulge information about their signal strength relative to nearby cell towers. These reported signal strengths (intra-frequency measurement reports) are then used to triangulate the position of a phone.
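The geometry behind this kind of positioning can be sketched under simple assumptions: a log-distance path-loss model converts a reported signal strength into a distance estimate, and three such distances from known tower positions pin down a 2D location. Real equipment uses far more sophisticated signal processing; the constants and functions below are purely illustrative.

```python
# Illustrative sketch of triangulation from signal-strength-derived distances.
# The path-loss constants are assumptions, not parameters of any real device.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Log-distance path-loss model: estimated distance (same units as reference)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve the 2x2 linear system obtained by subtracting circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A phone at (30, 40) with three towers at known positions and exact distances
towers = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
dists = [50.0, 80.62257748, 67.08203932]
x, y = trilaterate(*towers, *dists)
print(round(x, 2), round(y, 2))  # prints: 30.0 40.0
```

In practice the distance estimates are noisy, so real systems use more than three measurements and a least-squares fit rather than an exact solve.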
Octasic appears to lean heavily on the work of Indian engineers and scientists overseas. A self-published biography of the company notes that while the company is headquartered in Montreal, it has “R&D facilities in India,” as well as a “worldwide sales support network.” Nyxcell’s website, which is only a single page requesting contact information, does not mention Octasic by name. Gizmodo was, however, able to recover domain records identifying Octasic as the owner.
Asset tracking is one of the highest growth application segments for the Internet of Things (IoT). According to a report by ABI Research, asset tracking device shipments will see a 51% year-on-year device shipment growth rate through 2024.
Expanding LPWAN coverage, technological maturity, and the associated miniaturization of sophisticated devices are key to moving asset tracking from traditionally high-value markets to low-value high-volume markets, which will account for most of the tracker connection and shipment numbers.
“Hardware devices for the asset tracking market are primarily dominated by the need to balance power consumption, form factor, and device cost. Balance and compromise between these three must be achieved based on the use-case and are dictated by the business case and possible return on investment for the customer,” said Tancred Taylor, Research Analyst at ABI Research.
“As these constraints are marginalized by greater volumes of adoption, by emerging technologies like eSIM or System-on-Chip, and by increasingly low-power components and connectivity, so too will the limitations on the business case.”
OEMs diversifying their hardware offerings
Expanding technological and network foundations drive the number of use-cases, and OEMs are responding by diversifying their hardware offerings. Some OEMs such as CoreKinect, Particle, Mobilogix, or Starcom Systems are innovating in this space by taking a reference-architecture or modular approach to device design for personalized solutions. Others are going to market with off-the-shelf or vertically-focused devices for quickly scalable deployments, such as BeWhere, Roambee, Sony, or FFLY4U.
Early adoption of asset tracking was in the fleet, container, and logistics industries to provide basic data on the location and condition of assets in transit. The total addressable market for these industries remains extensive, particularly as the solutions trickle down from the largest enterprises to small- and medium-sized companies.
Increased device functionality combined with component miniaturization is key to driving the next generation of low-cost tracking devices. This will enable granular tracking at the pallet, package, or item level, and open new markets and device categories, such as disposable trackers. Emerson, Sensitech, CoreKinect, and Bayer are among companies driving innovation in this field.
Product innovation accompanied by variations in business models
Innovation among product offerings is accompanied by variations in business models and go-to-market approaches. Mobile Network Operators (MNOs) are playing a significant role in driving adoption through increased verticalization, with Verizon, AT&T, and Orange among those offering subscription models for end-to-end solutions – comprising device, connectivity, software, and managed service offerings.
This model is additionally gaining traction among OEMs, with Roambee an early adopter of a subscription-only model and others such as Mobilogix following suit. This service-based model will gain further traction as OEMs move down the value chain by developing in-house capabilities or partner networks to simplify the ecosystem and the customer’s solution.
“While there is extensive work to be done on the hardware side to make low-cost trackers that can be simply attached to any ‘thing’, many OEMs are shifting from a hardware-only model to more of a consultative approach to a customer’s requirements and deliver personalized end-to-end solutions. Flexibility, simplicity, and cost are crucial to gain enterprise traction,” Taylor concluded.
Tracking of our browsing behavior is part of the daily routine of internet use. Companies use it to tailor ads to the personal needs of potential clients or to measure their reach. Many providers of tracking services advertise secure data protection, claiming to anonymize data by generalizing datasets.
Tracking services collect large amounts of data about internet users. These data include the websites accessed, but also information on the end devices used, the time of access (timestamp), and location information.
“As these data are highly sensitive and can be linked directly to individuals, many companies use generalization to seemingly anonymize them and to bypass data protection regulations,” says Professor Thorsten Strufe, head of the “Practical IT Security” research group at KIT.
Generalization reduces the level of detail of the information so that identifying individuals is supposed to become impossible. For example, location information is restricted to the region, the time of access is rounded to the day, or the IP address is truncated by several digits.
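The generalization steps described (regional location, day-level timestamps, truncated IP addresses) can be sketched for a single hypothetical tracking record; the record fields and values below are invented for illustration.

```python
# Minimal sketch of generalization applied to one (hypothetical) tracking record.

from datetime import datetime

def generalize(record):
    """Coarsen IP, timestamp, and location as described in the text."""
    ip_prefix = ".".join(record["ip"].split(".")[:3]) + ".0"  # drop last octet
    day = record["timestamp"].strftime("%Y-%m-%d")            # day precision only
    region = record["location"]["region"]                     # drop the city
    return {"ip": ip_prefix, "day": day, "region": region, "url": record["url"]}

record = {
    "ip": "192.0.2.57",
    "timestamp": datetime(2019, 7, 14, 9, 23, 41),
    "location": {"region": "Bavaria", "city": "Munich"},
    "url": "example.com/politics/article-123",
}
print(generalize(record))
# → {'ip': '192.0.2.0', 'day': '2019-07-14', 'region': 'Bavaria', 'url': 'example.com/politics/article-123'}
```

Each field individually looks harmless after this step; the study's point is that sequences of such records can still be identifying.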
Strufe, together with his team and colleagues at TU Dresden (TUD), has now studied whether this method really prevents conclusions from being drawn about individuals.
Using a large volume of metadata from German websites, covering 66 million users and over 2 billion page views, the computer scientists were able to draw conclusions not only about the websites accessed, but also about the chains of page views, the so-called click traces. The data were made available by INFOnline, an organization that measures online audience reach in Germany.
The sequence of page views is highly revealing
“To test the effectiveness of generalization, we analyzed two application scenarios,” Strufe says. “First, we checked all click traces for uniqueness. If a click trace, that is, the sequence of several successive page views, can be clearly distinguished from all others, it is no longer anonymous.”
It was found that information on the website accessed and the browser used has to be removed completely from the data to prevent conclusions from being drawn about individual persons.
“The data only become anonymous when the sequences of single clicks are shortened, which means that they are stored without any context, or when all information except for the timestamp is removed,” Strufe says.
“Even if the domain, the allocation to a subject, such as politics or sports, and the time are stored on a daily basis only, 35 to 40 percent of the data can be assigned to individuals.” For this scenario, the researchers found that generalization does not correspond to the definition of anonymity.
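The uniqueness check described above can be sketched in a few lines: a generalized click trace that no other user in the dataset shares is, by itself, identifying. The toy traces below are invented for illustration and are far shorter than real click traces.

```python
# Sketch of the uniqueness test: what fraction of users have a (generalized)
# click trace that appears exactly once in the dataset?

from collections import Counter

def unique_fraction(traces):
    """Fraction of users whose generalized click trace is unique in the set."""
    counts = Counter(tuple(t) for t in traces)
    unique = sum(1 for t in traces if counts[tuple(t)] == 1)
    return unique / len(traces)

# Toy generalized traces: sequences of (day, topic) page views per user
traces = [
    [("Mon", "politics"), ("Mon", "sports")],
    [("Mon", "politics"), ("Mon", "sports")],  # identical to user 0: not unique
    [("Mon", "politics"), ("Tue", "news")],
    [("Tue", "sports")],
]
print(unique_fraction(traces))  # prints: 0.5
```

With realistic trace lengths, the overwhelming majority of traces are one-of-a-kind, which is why the researchers found such high re-identification rates even after generalization.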
A few observations are sufficient to identify user profiles
In addition, the researchers checked whether even subsets of a click trace allow conclusions to be drawn with respect to individuals.
“We linked the generalized information from the database to other observations, such as links shared on social media or in chats. If, for example, the time is generalized to minute precision, a single observation is sufficient to clearly assign 20 percent of the click traces to a person,” says Clemens Deusser, a doctoral researcher in Strufe’s team who played a major role in the study.
“Another two observations increase the success to more than 50 percent. Then, it is easily obvious from the database which other websites were accessed by the person and which contents were viewed.” Even if the timestamp is stored with the precision of a day, only five additional observations are needed to identify the person.
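The linkage attack the researchers describe can be illustrated with a toy example: an outside observation, such as a link shared publicly at a known minute, is matched against the generalized database to narrow down whose trace it belongs to. The database and events below are entirely invented.

```python
# Toy sketch of a linkage attack: match an external observation against
# generalized per-user traces to see which users it could belong to.

def candidates(db, observation):
    """Users whose generalized trace contains an event matching the observation."""
    return [user for user, trace in db.items() if observation in trace]

# Generalized database: per-user sets of (minute-precision time, domain) events
db = {
    "u1": {("2019-07-14 09:23", "news.example"), ("2019-07-14 09:40", "mail.example")},
    "u2": {("2019-07-14 09:23", "shop.example")},
    "u3": {("2019-07-14 11:05", "news.example")},
}

# A single observed share can already be uniquely identifying; if several
# users match, each additional observation shrinks the candidate set further.
print(candidates(db, ("2019-07-14 11:05", "news.example")))  # prints: ['u3']
```

Once a trace is assigned to a person, every other page view stored in that trace is exposed as well, which is the privacy failure the study highlights.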
“Our results suggest that simple generalization is not suited for effectively anonymizing web tracking data. The data can still be tied to individual persons, so the anonymization is ineffective. Effective data protection requires methods that go far beyond generalization, such as adding noise by randomly inserting small erroneous observations into the data,” Strufe recommends.
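The noise-based defence suggested above can be sketched as randomly inserting plausible but false page views into a trace, so that the stored trace no longer exactly matches any outside observation. The decoy pages and insertion probability below are illustrative assumptions, not the researchers' method.

```python
# Sketch of noise insertion: decoy page views are randomly interleaved with
# the real trace. All page names and parameters are illustrative.

import random

DECOY_PAGES = ["news/front", "sports/scores", "weather/today"]  # hypothetical

def add_noise(trace, p_insert=0.3, rng=None):
    """Return a copy of the trace with decoy page views inserted at random."""
    rng = rng or random.Random()
    noisy = []
    for page in trace:
        if rng.random() < p_insert:
            noisy.append(rng.choice(DECOY_PAGES))  # the "misobservation"
        noisy.append(page)
    return noisy

trace = ["shop/cart", "shop/checkout", "bank/login"]
print(add_noise(trace, rng=random.Random(42)))
```

The trade-off is accuracy: the more noise is inserted, the less useful the data become for the audience measurement the tracking was meant to provide, which is why such defences must be tuned carefully.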
Privacy concerns will ratchet up further around IoT and 5G. Even if the industry manages to secure the billions of IoT devices already deployed, they permeate so many aspects of life that it will be nearly impossible to keep personal and private information out of the public domain.
The rollout of 5G will further accelerate the proliferation of IoT technology as manufacturers rush to produce low-cost devices with integrated connectivity. All Mobile Network Operators (MNOs) are keen to adopt 5G, with IoT and Enterprise services being primary drivers, providing operators with access to new revenue opportunities from new services and applications.
The proliferation of private data in the public domain will expand hackers’ capabilities. Social engineering is the most effective method cybercriminals use to breach secure systems, and attackers know consumers will continue to connect more devices in their homes, offices, and cars, not to mention public spaces, allowing them to build an ever more complete picture of a person’s activities, locations, likes and dislikes.
Even when these gadgets use encryption to transfer data, the backend systems with which they communicate may have their own flaws. And, even anonymized data can be used to infer a lot when cross-correlated. The Princeton University IoT Research Project had this to say about the phenomenon:
“Let’s say you have a Roku TV and that you are live-streaming the Bloomberg Channel without interacting with the TV otherwise. Do you know that the Bloomberg Channel could be communicating with 13 different advertising and tracking servers in the background? Or let’s say you have a smart Geeni light bulb. Are you aware that it could be communicating with a Chinese company every 30 seconds even while you are not using the bulb?”
One might recall the loyalty card craze of the 1980s, which spurred the IT storage market and opened the door to the broad adoption of data science technologies. Customers grew increasingly uneasy about the level of detail companies were tracking and able to infer about them. IoT may take this to a whole new level.
Smart connected devices are making the idea of Big Brother much more real; businesses can know what time their customers wake up in the morning, when they brush their teeth, when they put the baby to sleep, when they vacuum the living room, and what they watch on TV.
Customers might not feel violated today, but all this data could come back to haunt them in the future as ever more complete models of our lifestyles are built and fed into algorithms that make decisions that profoundly affect us: banks could deny loans, and insurance companies could increase premiums.
The data that represent our interactions with the connected world are undoubtedly valuable, and regulatory frameworks rightly exist to ensure they are used responsibly and stored and transferred securely. However, the speed of innovation and the range of information being collected are changing the game. Now is the time to design systems with visibility, transparency, and security integrated from the start.