from the outset of the development process, when it is much more likely to be effective110. That said, there is so far little evidence of market demand for privacy-friendly services – partly because of the difficulties individuals have in assessing and weighing up complex privacy risks. And while regulators have been discussing privacy by design for over a decade, the specifics of implementation have so far been limited111.

Companies can undertake “privacy impact assessments” when designing IoT systems, to consider how different design options might affect privacy. This can also reduce the risk of expensive delays and system redesigns – as was extensively debated during the development of the Netherlands’ smart meter programme112.

A significant amount of work has already been done on security and privacy issues by policy-makers and regulators in the EU and United States. Under the General Data Protection Regulation being debated in the European Parliament and Council of Ministers, there will be stronger regulatory incentives for companies developing systems that process personal data to protect security and privacy by design. The U.S. FTC also suggests that companies follow a “defence in depth” approach, considering security measures at several different points in their systems – for example, using access-control measures and encrypting data even when users connect through encrypted links to home Wi-Fi routers. Link encryption alone will not protect the data between the router and the company’s servers, or if the router is badly configured113.

Privacy is a particularly strong regulatory issue in European countries. A comprehensive legal framework includes the Council of Europe’s European Convention on Human Rights and Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, as well as the EU Charter of Fundamental Rights.
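The FTC’s “defence in depth” suggestion above can be sketched in code. The idea is that a device encrypts and authenticates its payload at the application layer, so the data remains opaque between the router and the company’s servers even if the Wi-Fi link or router is compromised. This is a minimal illustration using only a toy cipher built from Python’s standard library – a real device would use a vetted AEAD scheme such as AES-GCM, and the key-sharing arrangement shown is assumed, not prescribed:

```python
# Defence in depth, sketched: the payload is encrypted end-to-end at the
# application layer, so transport-level (Wi-Fi) encryption is only one of
# several protective layers. Toy cipher for illustration only.
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from SHA-256 in counter mode (toy)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Confidentiality (XOR with keystream) plus integrity (HMAC tag)."""
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Reject tampered payloads before decrypting."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("payload tampered with in transit")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

device_key = secrets.token_bytes(32)          # secret shared by device and server (assumed)
reading = b'{"meter_id": 7, "kwh": 0.42}'
blob = encrypt(device_key, reading)           # what the home router actually relays
assert blob != reading                        # opaque on every hop
assert decrypt(device_key, blob) == reading   # recovered only at the server
```

The point of the layering is that a badly configured router, or an unencrypted hop between router and server, exposes only the already-encrypted blob rather than the reading itself.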
This framework has been influential in the development of comprehensive privacy laws now in force in more than 100 countries around the world114. The EU already has a very detailed legal framework regulating the public and private sector’s use of personal data, with a general Data Protection Directive (95/46/EC) relevant to IoT device manufacturers, social media platforms and app developers that access IoT data, and an e-Privacy Directive (2002/58/EC) also relevant to IoT device manufacturers115. The European Commission has already sponsored a process to create an RFID privacy code of practice, developed collectively by industry and civil society and approved by the EU’s data protection authorities116.

These authorities have issued a detailed opinion on the IoT’s implications for privacy protection. They note that the IoT produces high-volume flows of personal data that could present challenges to traditional data protection regulation. For example, individuals will not necessarily be aware when data is shared, or able to review this data before it is sent to other parties, creating a risk of self-exposure and lack of control117.

A further privacy issue is the amount of personal information that can be derived from seemingly innocuous sensor data, especially when it is combined with user profiles and data from other sources. As European privacy regulators noted, “Full development of IoT capabilities may put a strain on the current possibilities of anonymous use of services and generally limit the possibility of remaining unnoticed.118” Smart meter data, for example, can be surprisingly revealing about individuals’ day-to-day activities, down to the detail of which programmes are being watched on a television119.
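The smart meter example can be made concrete with a short sketch. One simple inference technique matches step changes in a household’s total load against known appliance wattages; the appliance signatures and meter readings below are invented for illustration and are not drawn from the cited studies:

```python
# Toy non-intrusive load monitoring: match step changes in a power trace
# against known appliance wattages. Signatures and readings are hypothetical.
SIGNATURES = {120: "television", 2000: "kettle"}  # watts drawn when on (assumed)
TOLERANCE = 25  # watts of slack when matching a step change

def infer_appliances(watts):
    """Return (index, appliance, 'on'/'off') events from consecutive deltas."""
    events = []
    for i in range(1, len(watts)):
        delta = watts[i] - watts[i - 1]
        for sig, name in SIGNATURES.items():
            if abs(abs(delta) - sig) <= TOLERANCE:
                events.append((i, name, "on" if delta > 0 else "off"))
    return events

trace = [300, 310, 430, 425, 2430, 440, 320]  # total household load in watts
for i, name, state in infer_appliances(trace):
    print(f"t={i}: {name} switched {state}")
# The television's on/off times emerge from nothing but aggregate load data.
```

Even this crude matching recovers when the television was in use; the research cited above goes much further, down to identifying the programme being watched from fine-grained load fluctuations.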
Researchers have found that smartphone sensor data can be used to infer information about users’ personality types, demographics, and health factors such as moods, stress levels, smoking habits, exercise levels and physical activity – even the onset of illnesses such as Parkinson’s disease and bipolar disorder120. This kind of information has obvious positive applications, such as in pricing health insurance. But it can also be used for other decisions related to employment, credit and housing. This could lead to economic discrimination against individuals classified as poor credit or health risks, or potentially to “new forms of racial, gender, or other discrimination against those in protected classes, if Internet of Things data can be used as hidden proxies for such characteristics121.”

To protect individuals’ privacy, the FTC has suggested that notice and consent be required when personal data is collected by IoT applications outside the consumer’s reasonable expectation. That expectation should be based on the

90 Trends in Telecommunication Reform 2016