
2026: The year of tech, both heroes and villains

February 12, 2026


By Tom Wilmoth

Tech isn’t going away. In fact, 2026 promises both to show us the best and worst of it and to test our wisdom in picking the good and discarding the bad.

Because we all could use a little uplift, let’s start with a potential positive. Drunk driving is a long-running scourge, killing an average of 400 people each year in North Carolina and causing catastrophic damage to lives and property. We know tragedy lies ahead, yet we as a society consistently fail to avert it.

The good news is that a bill parked in the North Carolina General Assembly (HB789) includes a provision to encourage first-time drunk driving offenders to install ignition interlocks in their vehicles in exchange for a lighter sentence. These devices can prevent a driver from starting a car if they’ve been drinking.

This is an example of enabling tech to intercede in a way courts, law enforcement and loved ones have not been able to. In other words, tech could help society do a better job of keeping all of us safe.

The idea of doing a better job, though, is exactly what makes A.I. so formidable and frightening. Will the algorithms overtake human competency and end up doing our jobs and ruling the world? That’s the headline-making hot take.

But apocalyptic visions aside, A.I. has many potential benefits. For example, federal regulators are weaving A.I. into the way we track vaccine safety—an upgrade with huge upside for public health if we steer clear of some obvious traps.

The Food and Drug Administration’s Vaccine Adverse Event Reporting System (VAERS)—highly relevant to our state’s pharma industry—features A.I. tools meant to catch worrisome patterns sooner and intervene before problems spread. Think of it as giving an old warning light a far sharper set of eyes. If it works, lives could be saved by spotting rare complications long before human reviewers would have pieced them together.

Smarter analytics could, in time, allow regulators to run complex simulations that cut down on the number of animals or volunteers needed in early-stage studies. A faster, cleaner evidence stream means researchers can make better decisions sooner—and patients get safer vaccines.

But we should all be watching two key factors. First, false alarms. Machine learning is only as good as its training data. If the model starts flagging large numbers of harmless events as dangerous, public confidence in otherwise safe vaccines could wobble.

Second, privacy and security. VAERS contains sensitive health information. Any new digital pipeline has to guard against leaks, breaches, or inadvertent sharing. Clear protocols and independent audits should be standard. We’ll see if these take shape.

And privacy brings us to a potential “worst”: the continued peddling of usage-based insurance (UBI) programs. You may be caught up in one now. Typically, an insurance company entices you to use a device or phone app to track your driving (speed, braking, mileage, time of day), pitching it as a way for safe drivers to get discounts by demonstrating they’re low-risk.

But such programs can actually hurt you in ways you’re probably not aware of, and the consequences could be devastating. Insurers can sell your data to other insurance companies and undisclosed organizations, use your data against you in a personal injury claim, or leverage your data to raise your rates instead of lowering them.

In North Carolina’s contributory negligence system, if the insurance company can make it seem like you’re even 1 percent at fault for an auto accident, you could get nothing for your claim. Imagine having your own driver data used against you to deny your claim after a serious car accident.

Sounds bad, right? But that’s tech for you: upsides and downsides, and it just keeps coming. Lawsuits and regulatory action could slow down UBI programs and other maladies, though North Carolina is not moving to protect consumers the way some states are. Here’s hoping policymakers and business leaders choose beneficial tech, like ignition interlocks, and guardrails over exploitation.

Tom Wilmoth is an attorney at the Law Offices of James Scott Farrin, one of the largest plaintiffs’ firms in the Carolinas.
