Decanda Faulk (df@faulk-associates.com) is a Healthcare Consultant at Comprehensive Care Management Solutions Inc. and Senior Commercial & Compliance Counsel at the Law Office of Decanda Faulk P.C. in Newark, NJ, where she serves as outside consulting general counsel for two companies.
We live in a digital, mobile world where advanced technology allows businesses to acquire, access, alter, and create an abundance of data, most of which are analyzed through data science.[1] Data science lets a company analyze data in whatever manner it finds beneficial. Much of the data these technology- and data-driven resources touch, and that data scientists (members of a currently unregulated profession) analyze, are personal or sensitive. As technology and data become ever more embedded in our daily lives, a governing set of principles is prudent to ensure they do no harm.
Public support for smarter technology and applications exists, but as innovation in technology- and data-driven products advances, the public’s trust is eroding. People want reassurance that they can trust these resources. It is therefore wise for big data and big tech to adopt an ethical approach to technology deployment and data practice to strengthen public trust and maintain support for advanced technologies.
To maximize the benefits of these technologies and practices, a company can create a targeted technology and data practice code of conduct. Endorsed by the C-suite and corporate board, such a code signals to employees and the public that the company takes seriously its commitment to the responsible development, deployment, and use of technology, and it shows employees that their employer expects them to uphold that commitment. The government also has a larger role to play in promoting the responsible development and deployment of technologies, specifically data-driven technologies and artificial intelligence (AI). Moreover, the government has a duty to the public to ensure that technology is not developed and deployed in harmful ways.
Compliance remains essential to an organization’s viability. When compliance fails, the public often perceives every aspect of the organization, including its workforce, as untrustworthy or, worse, corrupt. The compliance failures of a few can taint an entire organization and harm many innocent people. The ethical lapses of giant tech companies, data breaches, and discriminatory AI, all increasingly common as technological innovation accelerates, are unacceptable and avoidable compliance failures. The technology industry, and businesses in general, therefore need to reassure end users that they are acting in the users’ best interests. A written technology and data practice code of conduct that governs behavior relating to product design, development, and deployment, as well as data practice, is one way to provide that reassurance.
A technology and data practice code of conduct is prudent
With the prevalence of technology, every business is now effectively a technology company, a status that carries a responsibility to deploy technology and handle data more responsibly. Like universities, which are starting to adopt a more medicine-like moral compass approach to computer science,[2] other businesses and the United States government can take a more ethical approach to technology innovation and data practice.
A medicine-like moral compass approach refers to an important step in becoming a doctor: medical students take the Hippocratic Oath, which includes a promise to “do no harm.” Moreover, in a privacy-, technology-, data-, and algorithm-driven world, where great functionality can coexist with risible data practices, a company’s use and overall oversight of technology should no longer begin and end with its chief information (or technology) officer. Rather, the responsibility should extend up through the C-suite and to the corporate board,[3] supported by sound government guidance.
With the increasing importance of advances in technology and data practice, more action is required. As with any advancement, it behooves society to ensure that progress does more good than harm. Consistent with their other obligations to end users, employers likely want to convey to employees, and demonstrate to the public, a commitment to developing and using data-driven technological tools and AI in an ethical, transparent, and accountable manner. A technology and data practice code of conduct can assist in this endeavor.
A technology and data practice code of conduct serves many purposes, the most significant being to signal a top-down commitment to the ethical, transparent, and accountable development and deployment of engineered technology products and data practices. In short, such a code sets the moral compass for developing technology and conducting data practice. If more companies (especially tech start-ups) adopt a technology and data practice code of conduct, it can reassure the public that technology need not come with harmful consequences, and the code can guide employees’ decision-making. Incorporating privacy by design, as supported by the International Association of Privacy Professionals and some tech companies, is another promising initiative. However, an important first step, and a demonstrable commitment to effective compliance and ethics, is a well-written and targeted code of conduct.
App designers, app developers, and other employees deserve guidance in their decision-making so they can innovate (or procure) in a more thoughtful, transparent, and accountable fashion, backed by a top-down commitment to creating (or acquiring) technology in accordance with the company’s values and principles. An effective technology and data practice code of conduct can provide a pathway to a safe and trusted environment where technology continues to flourish in a thoughtful, accountable, ethical, and transparent manner.
Over the years, many of the most meaningful compliance initiatives have required buy-in from the board and C-suite to be implemented, and they have been necessary to mitigate harmful conduct. A technology and data practice code of conduct can guide employees in following a uniform company vision and mission: build trustworthy engineered products; deploy technology, including platforms and applications, responsibly; and handle personal and sensitive data ethically and accountably. Moreover, such a code places greater emphasis on proactively mitigating discriminatory AI practices.