Alexander Trevino
Cybersecurity Ethics
Dr. Wittkower
Bill Sourour later told the story of his time as a young developer at a Toronto ad agency, building health sites for drug companies under tight Canadian rules. One assignment targeted adolescent girls. The questionnaire seemed to offer neutral guidance, yet almost every path funneled users to the client's medication; only when someone reported an allergy or confirmed current use did the result differ. The site presented itself as helpful information rather than advertising. A news story soon described a teen girl who died by suicide while taking that drug, which already carried known risks of major depression and suicidal thoughts. Sourour describes deep regret, a wounded conscience, and a decision to leave. The project abandoned the habits of care, so on a care-ethics account the work was wrong. Real needs went unnoticed, responsibility for users was not owned, skill was applied in a way that shrank agency, and feedback that should have prompted change was ignored. Well-being and choice should have come first, risks should have been made clear, manipulative logic should have been rejected, and review should have been pursued, or the work should have been refused. That conclusion also matches the professional codes and Armstrong's view of professional duty.
Care ethics focuses on relationships, context, and vulnerability. It asks four basic questions. Did you notice what the other person actually needs? Did you take responsibility for meeting that need? Did you act with enough skill to do it well? Did you stay open to feedback and change when harm appeared? It also values freedom from domination and a relational view of autonomy rather than isolated choice.
Apply that outlook to the site. The design and tone made the team look like a helper in a health setting. When a tool invites trust like that, users expect watchfulness and responsiveness. The hidden logic did the opposite. The outcome was fixed in advance. That is not caring attention. It is steering that hides itself. The audience was especially sensitive to social pressure and mood risk. The product sounded like their needs mattered, but it removed real choice. From a care perspective, that is a failure to accept responsibility. The makers stepped into a caregiving posture, then used that closeness to serve the sponsor.
Professional codes say the same thing in different words. The ACM Code and the joint ACM and IEEE Software Engineering Code require us to avoid harm, be honest and transparent, respect user dignity and independence, and put the public interest first. A care perspective turns those principles into concrete duties inside the developer-user relationship. When a medicine carries serious mental health risks, dangers must be visible before any suggestion appears. That means plain-language warnings, links to credible independent sources, and prompts to speak with a clinician. The site dimmed risk and made an unwise choice more likely. Honesty also requires accurate naming. A marketing funnel should not be called a quiz. That label breaks the caring practice of truthful presence. If we respect autonomy, we avoid designs that exploit trust and information gaps. The interface should have widened the user's agency, not narrowed it. Serving the public means treating users and their families as people who can rightly expect conscientious work from us.
A caring version of the feature would look very different. A real decision aid would use clear logic with more than one outcome. It would present risk information early and in simple language, and include crisis resources for mood changes. It would add a pause before any recommendation so users read and acknowledge key facts. It would welcome user feedback and outside clinical review, and it would change course if problems surfaced. Inside the company, ethical concerns would be logged, and medical oversight would be required before launch.
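To make that design concrete, here is a minimal sketch of what honest branching logic might look like. Everything in it is hypothetical: the condition names, the warning text, and the outcomes are illustrative stand-ins, not details from the actual site.

```python
# Hypothetical sketch of a decision aid with real branching logic.
# Field names, conditions, and messages are illustrative only.

RISK_WARNING = (
    "This medication carries known risks of major depression and "
    "suicidal thoughts. Please discuss any option with a clinician."
)

def recommend(answers: dict) -> dict:
    """Return a recommendation with more than one possible outcome.

    The risk warning is attached to every result rather than hidden,
    and no branch is forced toward a single sponsored product.
    """
    result = {"warning": RISK_WARNING, "see_clinician": True}

    if answers.get("allergy"):
        result["suggestion"] = "Avoid this medication; ask about alternatives."
    elif answers.get("history_of_depression"):
        result["suggestion"] = "Discuss non-drug options and monitoring first."
    elif answers.get("currently_taking"):
        result["suggestion"] = "Review side effects with your prescriber."
    else:
        result["suggestion"] = "Several options may fit; compare them with a clinician."
    return result
```

The point of the sketch is structural: the warning travels with every outcome, and the answers genuinely change the result instead of funneling to one predetermined endpoint.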
Within this outlook, Sourour's options become clearer. In planning, he could have named the harm plainly. He could have said the feature simulates help while steering and that the audience is minors who deserve extra care. He could have proposed true branching logic, visible warnings, and independent medical review. If managers resisted, he could have raised the issue with compliance, legal, or a clinical reviewer and kept a written record. If those routes failed, he could have refused to ship the code and asked for reassignment. If that was not possible, he could have stepped away and documented his reasons. These actions align with both the codes and the four practices of care. The later tragedy shows what was at stake, but the moral wrong existed earlier. The relationships with users were treated as a sales interaction rather than care, and harm had already begun. Armstrong ties professional ethics to trust and specialized knowledge. Non-experts cannot easily judge risk or verify claims, so professionals take on duties that go beyond ordinary market behavior. Those duties include independent judgement, frank disclosure about limits, loyalty to the public good, and the resolve to resist organizational pressure when trust would be betrayed.
Being a professional is not only finishing tasks. It means binding your discretion to the interests of those affected by your work. This account complements care ethics. Both focus on unequal power and responsibility. When a non-expert uses a tool that touches health, the developer who holds technical knowledge should act as a steward. That means noticing needs, owning the obligations that come with expertise, practicing with integrity and skill, and staying ready to revise when feedback reveals harm. The site showed the reverse posture. Technical skill was used to shape behavior rather than support health.
With these ideas in mind, the actions become easier to judge. Independent judgement asks a developer to pause and ask what professional sense requires, given what they know about user influence and health risk. Red flags should have been obvious. Calling a sales funnel a quiz, aiming it at teenagers, and muting risk cues predictably undermines good decisions. Candor requires plain talk about what the tool is and is not. A caring professional would insist on the accurate label of sponsored product information, together with clear clinical disclaimers that people will read and understand. Loyalty to the public good means our final duty runs to the people affected by our code. In a care frame, that looks like everyday solicitude for safety and dignity. Integrity sometimes requires refusal. Armstrong highlights moments like that. Care ethics calls this taking responsibility for relational harm. Sometimes the right response is to say no.
What concrete steps follow from the combined view? In design review he could have challenged the rule that every path ends with the client drug. He could have asked for a real clinical decision tree based on clear criteria and checked by medical staff. He could have insisted on truthful naming and strong risk visibility, placing warnings up front and prompting a talk with a clinician before any suggestion appears. He could have helped set internal guardrails that block builds when recommendation logic lacks alternatives or when warnings miss a readability target. He could have written a brief record of dissent tied to the codes and to user safety, then escalated to leaders or to the sponsor's safety contact. If every path stayed blocked, he could have refused the assignment or left. Care ethics also includes self-care and integrity. Keeping one's moral agency intact supports a wider culture of care.
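The guardrail idea above can itself be sketched as a small pre-release check. This is a hypothetical illustration, assuming a release gate that inspects the set of possible outcomes and the warning text; the threshold and failure messages are invented for the example.

```python
# Hypothetical pre-release guardrail: report blocking problems if the
# recommendation logic has only one outcome, or if the risk warning is
# missing or too dense to read. The readability threshold is illustrative.

def check_release(outcomes: set, warning: str,
                  max_words_per_sentence: int = 25) -> list:
    """Return a list of blocking problems; an empty list means the gate passes."""
    problems = []
    if len(outcomes) < 2:
        problems.append("recommendation logic has no alternatives")
    if not warning.strip():
        problems.append("risk warning is missing")
    else:
        # Crude readability proxy: flag any very long sentence.
        for sentence in warning.split("."):
            if len(sentence.split()) > max_words_per_sentence:
                problems.append("warning fails readability target")
                break
    return problems
```

A check like this would have made the original design fail loudly in review: a funnel with a single endpoint and a buried warning cannot pass the gate, so the ethical concern is logged as a build failure rather than left to one developer's private unease.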
This extends beyond one story. Software now sits close to medicine and public life. Code shapes choices about health, credit, housing, and safety. Care ethics says our practice must start with relationships, not conversion or clicks. When stated requirements contradict that truth, professionalism asks us to rewrite the relationship. Sometimes that means changing a feature, slowing a release, bringing in an outside expert, or leaving the work. This is ordinary professional care, not hero talk.
The site wore the look of care while withholding the real thing, so on a care-ethics account it was wrong. It ignored user needs, shrugged off responsibility for predictable risk, used technical skill to channel behavior, and failed to respond to teens facing a prescription choice. Professional codes call for honesty, the avoidance of harm, and service to the public. Armstrong explains why those duties bind even more tightly when experts meet vulnerable people. A caring path would have rebuilt the feature around truthful guidance, highly visible risk information, and clinical oversight. If those protections were blocked, the right move was to refuse to ship or step away.
Some will say the work was legal and that persuasion is what marketing does. Law is a floor, not a full moral defense, and persuasion in health must respect autonomy and safety. Others will say the link to later harm is indirect. Care ethics answers that the wrong begins upstream. Harm has already been done when trust is manipulated and someone's choices are narrowed, even before outcomes appear in the news.