
People unknowingly add to ever-growing mountain of available personal data

  • The Wichita Eagle
  • Published Saturday, Jan. 19, 2013, at 9:25 p.m.
  • Updated Saturday, Jan. 19, 2013, at 9:37 p.m.

What do a philosopher, a law school dean, a technologist and a private investigator named Emery Goad all have in common?

This:

They say we humans are creating huge databases about our personal information, our tastes, our flirtations, our finances.

We’re doing this with nearly every phone call, text, keystroke, Facebook posting and store purchase. We’re unwittingly sketching out glimpses of our virtues, vices, sins and souls.

Strangers are getting access to this information.

The new world we are creating with smart technology, says technologist Ravi Pendse, is wonderful but worth pondering.

In the last few days, as he and Wichita State University President John Bardo talked about plans to expand WSU’s role in creating technology, Pendse said that perhaps people from the university’s philosophy department should be looped in to look at the big picture.

Susan Castro teaches in that department. Pendse is right to think of them, she said. “We try to get to the heart of things, the most basic questions.”

Along with ethics and morals, she studies complex systems in her work as a philosopher and scholar.

She said complex systems have a way of doing “unexpected things” when they get too big.

Revealing ourselves

Emery Goad says we’d be surprised if we knew what he can learn about us with “a few flicks of a keyboard.”

All of it is legal, he said; all of it is easy.

He is a private investigator who chews cigars and does intrusive searches, many of them on computer keyboards.

Among other things, he does background checks on people, for landlords, employers and sometimes for people he doesn’t know.

Out of courtesy for his clients, he doesn’t want to talk about much of this work. But he has tips on how we reveal ourselves to strangers:

Any time we order takeout or delivery online or by phone, any time we order from Amazon or buy food with a grocery card, we add information to databases about ourselves.

We create other databases when we pay bills, when we bank and when we enter risqué words or photos in social media.

In a criminal or civil court case, even images and words we have deleted can be retrieved.

The businesses we patronize automatically store our data. In the past, he said, they had little use for it. But then data miners appeared.

“Free money,” Goad said. “The data miners offer money, and all the businesses have to do is hand over the data.”

Future of data

Thomas Romig became Judge Advocate General of the U.S. Army 19 days after the Sept. 11 attacks. He led 9,000 lawyers and civilians working in the Army legal system in wartime.

He confronted the biggest ethical question of the war. Was waterboarding torture, and illegal? He said yes, in writing.

He’s now dean of the School of Law at Washburn University, and says new ethical and legal questions abound in law schools. About two years ago, he said, “We began stressing to students what we saw as the unfortunate fruits of getting on social media and doing and saying stupid things.”

Instructors began explaining to first-year law school students how the Facebook entries they might have posted years ago could come back to haunt them in Internet searches done by “future employers, even future clients.”

“We have seen the shocked looks on their faces,” he said. “We tell them that if they’ve got words or pictures that look embarrassing or don’t shed the best light on you, get them off of there. Better yet, never put them on there.”

Employers looking at job applicants’ Facebook accounts, or asking applicants for permission to look, is a big issue.

“What is personal information and what is professional?” Romig said. “Let’s say the employer asks each applicant whether the company can look. If only one or two of the applicants say yes, and if you don’t hire anyone unless they say yes, do the other applicants have a cause of action against you? They may very well.”

At least three states, California, Illinois and Maryland, he said, have passed laws prohibiting employers from doing Facebook searches on job candidates. Most states have not yet considered such laws, he said.

What’s coming is far more complicated than Facebook embarrassments, he said. Goad is right about companies selling information. The law hasn’t kept up. “Credit companies have incredible amounts of information on all of us,” he said.

Giving our information “is important to business, important to commerce,” he said. “But where is the line? These credit card companies are selling our information, sometimes without paying attention to what they are doing with it.”

In the military, Romig’s fellow soldiers used flying drones to kill insurgents. In the future, drones will be used for everything from agriculture to traffic control.

“If a drone on routine traffic patrol over a downtown boulevard picks up something that looks illegal in someone’s backyard, is that information admissible?” Romig said. “The courts will wrestle with that. Was it in plain view? Does the homeowner have a reasonable expectation of privacy?”

When it comes to protecting privacy, courts and lawmakers are way behind.

Unforeseen consequences

Susan Castro writes about German philosopher Immanuel Kant in her scholarship but reads science fiction for fun.

Any time you want scary examples of how technology creates chaos, she said, read science fiction. Technologists perhaps should consult science fiction writers, she said. It might give them pause.

What will technology do to us, our politics, our morals, our ethics, our democracy? For example, will technology further divide our have-nots from those who have more?

The possibilities are not all bad, she said.

“If the new fully integrated technology is controlled exclusively by a monopoly of the very wealthy, it could be a tool of great evil, as many science fiction writers have explored,” she wrote in an e-mail. “If, on the other hand, we continue the development of systems with decentralized control, user-configurable components, and open source code … then our new tech may be a source of great power for oppressed populations … those in power can lose their privacy and their monopoly on information through the same technology they might abuse to maintain power.”

Among other things, Castro teaches ethics, logic and moral theory in her philosophy classes at WSU.

For a long time after smartphones became ubiquitous, she declined to get one, fearing those same unforeseen consequences that Goad, Romig and Pendse warn about.

Then she got one. And loved it. And found unforeseen consequences. Addiction, for example.

Scholars look up things. Long ago they had to go to libraries, read books. There was a purity in that.

It meant work, effort — scholarship.

Now she has her smartphone, and finds herself “addicted.” To how easy searches are. To how easy “treasure hunts” are — not necessarily using information, but hunting for it. Addictive.

Her smartphone, she said, is “my sinister little crutch.”

She gave two interviews for this story, by the way. One was by telephone.

The other was by e-mail, which is where she wrote the quotation a few paragraphs above about “integrated technology” and “decentralized control.”

“Hope this helps!” she wrote.

“Sent Ironically from my phone.”
