Turning on the lights, hearing the weather forecast, learning fun facts, and playing your favorite song in the kitchen are simple when one can give short voice commands to a personal assistant device that is connected to the internet and to other devices in your home. Connected devices are increasingly being used in the home, not just for everyday tasks, but for babysitting children, securing the home, tracking fitness, and acting as marital aids. There are even connected devices marketed for use in the office or while traveling. It is becoming almost impossible to avoid devices that can connect to the internet, a smartphone, or other devices. However, as we have reported previously, these devices can present serious privacy and security issues.

In light of these privacy and security issues, U.S. and international regulators alike have published guidance with the goal of standardizing internet of things ("IoT") device privacy and security. Mandatory regulations, however, are lacking, leaving companies to rely on best practices to minimize privacy and security risks.

California, though, true to its usual role as a first mover in regulating new technology, has had an information privacy bill for connected devices in the works since Feb. 13, 2017. In March 2017, we identified the bill and the privacy concerns the state and its regulators may be weighing when it comes to connected devices. Less than a year later, in January 2018, the bill passed the state Senate and moved to the Assembly for consideration. It has been read once and is currently being "held at desk" in the Assembly, waiting to be referred to a committee.

After being introduced, the bill was substantially transformed, with several of its proposed requirements for connected devices stripped entirely before it entered the Assembly. The bill at one point contained both privacy- and security-related requirements, but now largely imposes security obligations.

Despite these changes, California appears to be attempting to further develop IoT privacy and security standards. The bill does not seem inconsistent with those standard-making efforts: it sets a reasonableness standard, and reasonableness has historically been determined by reference to industry standards and best practices.

As it stands, the bill applies to manufacturers that "sell or offer to sell a connected device to a consumer" in California. A "connected device" is defined as a "device, sensor, or other physical object that is capable of connecting to the Internet, directly or indirectly, or to another connected device." The bill obligates manufacturers to "equip the device with reasonable security features appropriate to the nature of the device and the information it may collect, contain, or transmit, that protect the device and any information contained therein from unauthorized access, destruction, use, modification, or disclosure."

This obligation may seem to cover security broadly, but the hook is the standard of "reasonable[ness]," the same standard the Federal Trade Commission (FTC) applies to data security generally. The bill would not obligate manufacturers to seek out the highest level of security measures on the market; rather, it creates a floor of at least the most "basic security standards," according to the latest Senate Floor Analyses. The purpose of the bill seems to be not so much to force companies to heighten their levels of security as to ensure that IoT devices have some sort of security in place, such as basic encryption, as soon as they hit the market.
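To make the idea of a "basic" security feature concrete, the following is a minimal, purely illustrative sketch, not drawn from the bill or any guidance cited here, of a device encrypting its telemetry in transit rather than sending it in plaintext. The hostname, port, and payload are hypothetical.

```python
# Illustrative sketch only: one way a device might apply basic transport
# encryption when reporting sensor data, using Python's standard library.
# The hostname, port, and payload below are hypothetical placeholders.
import json
import socket
import ssl


def send_reading(reading: dict, host: str = "telemetry.example.com", port: int = 443) -> None:
    """Send a sensor reading over a TLS-wrapped socket instead of plaintext."""
    context = ssl.create_default_context()  # verifies the server certificate by default
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(json.dumps(reading).encode("utf-8"))


if __name__ == "__main__":
    send_reading({"device_id": "thermostat-01", "temperature_c": 21.5})
```

A floor-level measure of this sort protects data in transit but says nothing about storage, authentication, or update mechanisms, which is why "reasonable" security will likely continue to be judged against industry practice as a whole.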

It is important to note that the bill defines "consumer" as a "person who purchases or obtains a connected device for personal or household use." It is unclear whether this would include companies that purchase connected products for their employees for work (rather than personal) use. It is also unclear whether the distinction between "obtaining" a device and "purchasing" one is purposeful. For example, this language may place liability on manufacturers to ensure that security measures are in place to prevent an initial purchaser from customizing a device in a way that suits their personal needs but lowers its security, before giving or donating it to someone else once their own use of the device has run its course. Perhaps an adult purchases a connected toy for personal use, recodes or tinkers with it to make it do things it was not originally programmed to do, lowers its security measures in the process (deliberately or accidentally), and then gives it to a child as a gift. Under the current language of the bill, there may be some obligation on manufacturers to ensure that such tinkering cannot happen. This may clash with the needs of the hacker and tinkering community, which sometimes includes a company's most devoted users, customers, and followers.

The bill does state, "This title shall not be construed to impose any duty upon the manufacturer of a connected device to prevent a user from having full control over a connected device, including the ability to modify the software or firmware running on the device at the user's discretion." Although this makes clear that a manufacturer is not obligated to bar a user's tinkering, read together with the previous language it does not assure companies that they are free of an obligation to at least ensure that such modifications do not dilute a device's security measures.

That being said, the bill also carves out specific wiggle room for manufacturers. The bill states, "This title shall not be construed to impose any duty upon the manufacturer of a connected device related to unaffiliated third-party software or applications that a user chooses to add to a connected device." This may not fully put companies at ease about whether they must prevent tinkering, but it does allow them to stop worrying about compliance with the law where a consumer's changes are made by adding third-party software.

Another interesting carveout is the following:

A covered entity, provider of health care, business associate, health care service plan, contractor, employer, or any other person subject to the federal Health Insurance Portability and Accountability Act of 1996 (HIPAA) (Public Law 104-191) or the Confidentiality of Medical Information Act (Part 2.6 (commencing with Section 56) of Division 1) shall not be subject to this title with respect to any activity regulated by those acts.

There is a market for connected devices used to collect and process health data. This includes connected toys used in hospitals to calm children who may be undergoing health procedures. This carveout likely reflects a desire to avoid clashing security obligations, especially considering that HIPAA already contains stringent security requirements.

The law also states:

This title shall not be construed to impose any duty upon a provider of an electronic store, gateway, marketplace, or other means of purchasing or downloading software or applications, to review or enforce compliance with this title.

Companies may also want to take note of the evolution of the bill itself. As previously mentioned, the bill has undergone several changes and used to include several privacy obligations and other definitions that have since been stricken.

Some strikes make sense in the context of current privacy and security legal trends.

Legislators recently struck the following: "'Connected device' shall not include a motor vehicle as defined in Section 415 of the Vehicle Code." This may make sense, as legislators realized that the purpose of the law is to instill basic security measures across the connected device market. Considering that vehicles can connect to phones and other devices, it would seem odd to explicitly exclude them.

Vehicle manufacturers should note the difference between a connected car and an autonomous car. A connected car may allow users to connect a phone to its speakers to stream songs for a road trip, while an autonomous car may provide that ability as well as the ability to drive and navigate itself to the user's destination. An autonomous vehicle is likely a connected vehicle, but a connected vehicle is not necessarily autonomous. This difference is important in light of recent regulations passed in California. The California Department of Motor Vehicles adopted, on Feb. 26, 2018, regulations that specifically govern the testing and deployment of autonomous vehicles, including standards of privacy and security. While those regulations cover autonomous vehicles, the only legislation under consideration in California that may cover nonautonomous connected vehicles is the very bill discussed here. Therefore, the decision not to exclude vehicles from this bill is of note, considering that other legislation and regulations on vehicle security focus specifically on autonomous vehicles.

Legislators also struck a definition of "deidentified information," as well as a short provision excluding deidentified information from the law. This may be because there is too much contention in technical and legal communities over whether deidentification of personal information is even possible. The idea is that deidentifying personal information would strip the information in a way that effectively prevents reidentification. In privacy and security circles, some groups argue that there will always be technical ways to reidentify the data, no matter how sophisticated the deidentification process.
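To illustrate why this debate persists, the following is a purely hypothetical sketch (the field names and record are invented for illustration) of a naive deidentification pass that removes direct identifiers. The remaining quasi-identifiers can still allow reidentification when combined with outside data sets, which is the crux of the contention described above.

```python
# Illustrative sketch only: a naive "deidentification" pass that drops direct
# identifiers from a record. All field names and values are hypothetical.
# Note that the remaining quasi-identifiers (ZIP code, birth date, usage data)
# may still permit reidentification when linked with external data sets.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "device_serial"}


def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {key: value for key, value in record.items() if key not in DIRECT_IDENTIFIERS}


record = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "device_serial": "SN-12345",
    "zip_code": "94105",         # quasi-identifier
    "birth_date": "1980-02-14",  # quasi-identifier
    "daily_steps": 8421,
}
print(deidentify(record))
```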

The provision may also have been struck because, had it been included, this bill would have broadened the current scope of the application of deidentified data. Currently, HIPAA is the only U.S. privacy-centric regulation that mentions the term. HIPAA, however, applies only to health data and explicitly prescribes strict approaches by which organizations using deidentification measures can avoid liability under the regulation. Had the deidentification provision remained in this connected device bill, deidentification could have been applied in contexts beyond health data. The inclusion of the provision might even have diluted what proper deidentification measures entail, as California legislators had yet to add a provision explicitly explaining how appropriate deidentification may occur, in contrast with HIPAA's strict and narrow approach and with the European approach to the issue.

Additionally, legislators may have struck this provision to further tailor the bill to focus on security rather than privacy. As discussed below, certain provisions that would have protected consumers' privacy were also struck. Deidentification is related more to privacy of data than to security of such data, since deidentification is a method of stripping certain aspects of data in order to protect the identity of a person, rather than to protect the data from being misused. By striking the provision on deidentification, the California legislators may have been opting for a more security-centric bill.

Indeed, probably the most notable provisions struck from the bill concerned privacy transparency, a topic now entirely absent from the text. The bill used to state that manufacturers "shall provide notice through the use of words or icons on the device's packaging, or on the product's, or on the manufacturer's Internet Web site, of all of the following:

(a) Whether the device is capable of collecting audio, video, location, biometric, health, or other personal or sensitive user information, including specifying which type or types of information the device may collect, if that information is not otherwise indicated by packaging or by the stated functionality of the device.

(b) The process by which a connected device collects the information specified in subdivision (a), as well as the frequency of collection and what types of interactions with the device may trigger collection.

(c) If and how the consumer can obtain information about security patches and feature updates for the connected device."

IoT manufacturers should watch closely to see whether this or other privacy notice requirements find their way back into the bill as it progresses.

Overall, the California bill on IoT devices, as currently written, does not do much more than impose the security obligations the FTC already applies under Section 5 of the FTC Act. It does raise interesting questions, such as what its definition of "consumer" may change for manufacturers, and it does not expressly describe what it considers "reasonable security features."

These questions may become clearer with time, but what is clear now is that this bill, if passed, will provide consumers another outlet, beyond relying on FTC enforcement, for protecting their security. The bill states that existing law "authorizes a customer injured by a violation of [the bill's] provisions to institute a civil action to recover damages." As the bill would be added to "Part 4 of Division 3 of the Civil Code, relating to information privacy," it is likely referring to the civil actions outlined in Division 4 of California's Civil Code. Division 3 creates obligations, or legal duties, that must be followed in California, while Division 4 (sections 3274-3428) describes what relief may be sought in the event of a breach of obligation. The available relief ranges from compensatory relief to specific and preventive relief.

Additionally, despite certain privacy obligations being stricken from the bill, companies should still consider the benefits of employing privacy by design, following the Fair Information Practice Principles, and consulting the FTC's general guidance on IoT devices and its comments on draft guidance regarding communicating upgradability, security patches, and transparency.

Companies should also consider the evolving efforts to develop international standards, such as guidance published by the IoT Security Foundation in the United Kingdom, security best practices published by the Institute of Electrical and Electronics Engineers, a global nonprofit, and the National Institute of Standards and Technology's current draft Interagency Report on international cybersecurity standardization for IoT. With the General Data Protection Regulation ("GDPR") becoming effective on May 25, 2018, companies with ties to Europe should also look to what data protection supervisory authorities in each European member state have said about IoT devices.

While such best practices should be followed, maintaining them indefinitely may not be feasible. For example, if a company decides to retire its IoT device manufacturing and move into a different industry area, such as solely developing software, it would have difficulty doing so if it were required to keep its IoT business running. Whether companies should be required to maintain data security indefinitely is an interesting policy question that legislators may consider in the future.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.