Top 10 Ethical Dilemmas in Science for 2020


For the eighth consecutive year, Dr. Jessica Baron, in collaboration with the John J. Reilly Center for Science, Technology and Values at the University of Notre Dame, has released the annual list of emerging ethical dilemmas and policy issues in science and technology for 2020.

The list, which features up-and-coming technology extensively used in the science and technology industries, is released annually in mid-December. The thought-provoking selections are intended to ramp up dialogue among citizens and scientists alike. As Baron writes, “We live in an era of rapid development as technologies that seemed theoretical only a few years ago are increasingly incorporated into our daily lives. Our concern is that there’s little public dialog about the use and risks of these technologies, and that dialog is necessary to keep public policy in pace with science and technology.”

Here is Baron’s 2020 list, presented in no particular order.


1. The Pseudoscience of Skincare

Society is vain—no surprise there. But the skincare market is taking advantage of that fact, with “skin tech” expected to be worth $12.8 billion in 2020. The skin tech subcategory includes, but is not limited to, LED masks, electronic face scrubbers, facial massagers, smart mirrors and skincare cameras. The problem is that beauty companies market these products as “clinically proven” when that is, in fact, not the case. Most research done by manufacturers does not follow the scientific method and is not reproducible. Nor are the experts hired to tout these products scientists—they are often celebrities, or even dermatologist-celebrities with agendas of their own.

 

2. AI and Gamification in Hiring

Here Baron asks a startling question: are you your data? While hiring companies can already see a candidate’s social media history, some are going a level beyond and using neurological games and emotion-sensing facial recognition as part of their assessments. Taken to the extreme, this means a machine could decide whether you are right for a position based entirely on your responses to a game or your facial expressions. Never mind your resume, your phone interview, your in-person interview, or your impressive track record—it could all be for naught.


3. Predatory Journals

Researchers estimate there are roughly 8,000 predatory journals—journals that lack ethical practices such as peer review and have extremely low standards. The trouble is that when these journals publish anything, the information becomes fodder for unknowing researchers and scientists who are duped into believing it is true. Given the immense pressure on academics to publish, some become desperate enough—intentionally or unintentionally—to engage with these predatory journals. As you’ll read later in this list, fake data is not something we can afford much more of.

4. The HARPA SAFEHOME Proposal

President Donald Trump’s White House is considering a controversial plan to monitor the mentally ill as a way to stop mass shootings in the U.S.—a program that sounds a lot like a real-life Minority Report. HARPA, backed by a third-party pancreatic cancer foundation with no governmental ties, would leverage data available on phones and smartwatches to detect when mentally ill people are about to turn violent. Beyond the infringement of civil liberties, research has not found reliable benchmarks to predict violent behavior, or even to reliably distinguish the mentally ill from the non-mentally ill.


5. ClassDojo and Classroom Surveillance

ClassDojo is a popular online tool that, through recording in the classroom, scores children on their behavior and then shares those scores with the class, as well as with parents. The company behind the system says it is meant to foster positive behavior in the classroom, but critics raise more than a few concerns, including: 1) can the information be hacked?; 2) how is good behavior quantified and defined?; and 3) does it promote anxiety and shame among students?

6. Grinch Bots

Aptly named “Grinch bots” are online entities that buy up popular goods as soon as they hit the market in order to control supply and demand. Once the goods are sold out, they are resold on the secondary market at an inflated price. This isn’t a new problem, but there isn’t a new solution, either. In 2016, Congress passed the Better Online Ticket Sales (BOTS) Act, but it hasn’t been very effective. The Stopping Grinch Bots Act of 2018 was introduced last year and is currently awaiting further action in the House. Even so, the bill would only make it illegal to resell products purchased by automated bots, and it obviously doesn’t apply to the rest of the world.


7. Project Nightingale

Dubbed Project Nightingale, this partnership sees Ascension, the second-largest health care system in the U.S., collaborating with Google to host health records on the Google Cloud. With roughly 2,600 hospitals, doctors’ offices and other related facilities spread across 21 states, Ascension holds tens of millions of patient records. Both companies signed a HIPAA (Health Insurance Portability and Accountability Act) agreement, meaning Google can’t do anything with the records other than provide a cloud hosting service. However, The Wall Street Journal reported that neither doctors nor patients had been informed of what was happening with these records, and that roughly 150 Google employees had access to the data. As data increasingly moves to the cloud and other storage options, and as companies such as Microsoft and Apple launch health projects of their own, we have to ensure our data is protected.

8. Student Tracking Software

Universities are increasingly using predictive analytics to—essentially—stalk a candidate. Some college websites use software that reveals the name, age, ethnicity, address and contact information of a candidate, as well as which specific college sub-pages they visited and how long they spent on each page. The college then uses these factors to compute an “affinity score” that estimates how likely a candidate is to accept an offer. But, Baron says, when colleges assign scores to students based on income and interest, it strips applications of much of their context, and it discriminates against low-income students and those without dedicated Internet access. These analytics have the potential to harm a prospective student’s college admission based on an algorithm’s assumptions about what an ideal candidate looks like.


9. The Corruption of Tech Ethics

When CRISPR-Cas9 gene editing went mainstream in 2012, researchers immediately called for a moratorium because of the system’s powerful potential. National and international meetings followed, and multiple groups got involved—overall, it went exactly as it was supposed to. Now, however, the legitimacy of the ethical researcher is taking a hit as lawyers, business people, journalists and others muddy the waters. Ethics officers need rigorous training and an understanding of the frameworks for ethical decision making. Otherwise, ethics turns into a merry-go-round.

 

10. Deep fakes

Manipulating video and audio to make them appear to be something they are not is nothing new. However, the recent application of deep learning to create hard-to-identify fakes is more sophisticated, and more concerning. States are attempting to build legislation against deep fakes, and companies like Facebook and Microsoft want to help develop tools to spot them. But these days, just about anyone can download deep fake software to create fake videos or audio recordings that look and sound like the real thing—and nothing gets deleted from the Internet.

For a more complete analysis for each issue, as well as additional resources on the topic, visit the Tech Top Ten website.