
View Full Version : New Surveillance and Personal Security issue (cameras on planes)



ramus
20th February 2019, 18:53
Like the article below. I guess they think we're that stupid.
-------------------------------------------------------------------------
Airplane seat cameras could be your new spy in the sky

Commentary: A camera trained on you for an entire long-haul flight? Surely you
can't be serious...

by
Claire Reilly

February 19, 2019 3:06 AM PST

https://www.cnet.com/news/airplane-seat-cameras-could-be-your-new-spy-in-the-sky/

If Elon Musk is to be believed, the future of air travel involves superfast
flights on Big F--king Rockets, taking us from London to Tokyo in 37 minutes of
chrome-clad comfort.

But we are disgusting monsters and that is not the future we deserve.

In our timeline, the aircraft of the future will be a flying nightmare tube,
full of belching human meat sacks crammed cheek by jowl into rows of seats that
record our every movement.

At least that's the future that one Singapore Airlines passenger uncovered this
week.

Twitter user Vitaly Kamluk shared a photo of what looked like a camera installed
directly beneath the inflight entertainment screen on a Singapore Airlines
aircraft.

Singapore Airlines replied that it was indeed a camera, embedded into the seat
back by the original equipment manufacturers of the plane, but said the cameras
had been disabled on its aircraft and "there are no plans to develop any
features" using them.

And I think we can all agree, if a camera is built into a device, it will always
remain unused, always remain secure, and will never be used to capture footage.

Which is lucky because next time I'm flying the 23 hours it takes to get from
Sydney to literally anywhere, I want to rest assured that my metal hotbox of an
airplane isn't recording my dumb face as I horf down questionable casserole,
snore my way through three Mission Impossible films and then quietly sweat
through the incubation period of that particularly virulent strain of gastro I
picked up on my last Tokyo stopover.

Ever since the jet engine revolutionised flying in the '60s, we've been sold an
image of air travel as a high-flying world of luxury and dewy-faced women in
pearls and twinsets staring wistfully out windows.

But the reality is anything but. It's all pre-teens opening plastic packets of
candy to work noisily around their retainer-clad maws. It's mouth-breathers
hoiking their shoeless feet onto tray tables, wide-set passengers manspreading
into your leg space, the smell of reheated stew served from a drum, the uneasy
detente that follows 12 solid hours of silent fighting over an armrest.

And oh, the vomiting. The last time I took an international flight, the girl in
the row opposite me hurled within 20 minutes of take-off. Her poor mother
quietly wiped hot sick off her inflight entertainment screen, perhaps wondering
why she wasn't on some better flight to meet a fellow named Armando in Aruba.

If the airlines of the future want to capture this kind of footage -- no doubt
after pulling some sort of "sorry, not sorry" when they eventually do activate
the cameras -- then have at it.

But you can bet that the "original equipment manufacturers" who installed these
cameras weren't trying to meet a growing demand for seat-cam footage of tired
plane passengers. We aren't livestreamed entertainment in this grim, dystopian
future: We are an audience to be marketed to, data to be mined and a captive set
of eyeballs to be coerced.

Will the cabin crew be taking notes about whether seat 64B is watching the
safety demonstration? Will the cameras track our gaze to see if we're watching
in-flight commercials? And what happens if we turn away?

Cameras are in our advertising billboards, in our smart home devices and on
every second street corner, tracking our movements and slowly building up a
picture of our lives in minute-by-minute real time. Add airplanes to the mix and
you have a terrifying new way to calculate your social credit score. What
happens on the way to Vegas doesn't stay in the air.

Air travel is changing. But it's going to be damn hard to replace visions of
Frank Sinatra singing "Come Fly With Me" with a 24-hour live stream of screaming
children hurling box casserole into their seat-back camera.

Singapore Airlines did not immediately respond to a request for comment.

ramus
20th February 2019, 18:57
If you believe this, we need to talk, because I have some oceanfront property in Arizona. I guess they think we're that stupid.

I worked in manufacturing for 30 years; they would not spend an extra dime on a part they weren't going to use.
---------------------------------------------------------------------------------------------------------------------------------------------------------------
Google says the built-in microphone it never told Nest users about was 'never supposed to be a secret'

https://www.businessinsider.com/nest-microphone-was-never-supposed-to-be-a-secret-2019-2


In early February, Google announced that Assistant would work with its home
security and alarm system, Nest Secure.
The problem: Users didn't know a microphone existed on their Nest security
devices to begin with.
On Tuesday, a Google representative told Business Insider the company had
made an "error."
"The on-device microphone was never intended to be a secret and should have
been listed in the tech specs," the person said. "That was an error on our
part."

In early February, Google announced that its home security and alarm system Nest
Secure would be getting an update. Users, the company said, could now enable its
virtual-assistant technology, Google Assistant.

The problem: Nest users didn't know a microphone existed on their security
device to begin with.

The existence of a microphone on the Nest Guard, which is the alarm, keypad, and
motion-sensor component in the Nest Secure offering, was never disclosed in any
of the product material for the device.

On Tuesday, a Google representative told Business Insider the company had made
an "error."

"The on-device microphone was never intended to be a secret and should have been
listed in the tech specs," the person said. "That was an error on our part."

Google says "the microphone has never been on and is only activated when users
specifically enable the option."

It also said the microphone was originally included in the Nest Guard for the
possibility of adding new security features down the line, like the ability to
detect broken glass.

Still, even if Google included the microphone in its Nest Guard device for
future updates — like its Assistant integration — the news comes as consumers
have grown increasingly wary of major tech companies and their commitment to
consumer privacy.

For Google, the revelation is particularly problematic and brings to mind
previous privacy controversies, such as the 2010 incident in which the company
acknowledged that its fleet of Street View cars "accidentally" collected
personal data transmitted over consumers' unsecured WiFi networks, including
emails.

Google bought Nest — which was initially known for its smart thermostat device
— back in 2014 for $3.2 billion. It became a standalone company in 2015 when
Google reorganized as Alphabet, but in February 2018 it was brought back into
Google under the leadership of the head hardware exec Rick Osterloh.

Today, Nest offers a variety of Internet of Things products including smoke
detectors, video doorbells, and security cameras.

Valerie Villars
20th February 2019, 19:25
If I hadn't seen the author's name posted, I would have sworn that Mike wrote that. I haven't laughed that hard in days. :thumbsup:

Ernie Nemeth
20th February 2019, 22:36
Bring a piece of tape to stick over the camera lens - or a bit of chewed up bubble gum...

seko
21st February 2019, 05:05
Bring a piece of tape to stick over the camera lens - or a bit of chewed up bubble gum...

That is what I was thinking about haha

petra
21st February 2019, 21:20
Lol, way to be gross as a solution, guys 😜

Not to go too off topic here, but check out these poor kids (https://www.telegraph.co.uk/news/2018/05/17/chinese-school-uses-facial-recognition-monitor-student-attention/); they kind of have no choice in the matter. Edit: this is movement recording, not full video recording. Supposedly. Kind of misleading, but really I think it's worse. It's like the computer psychoanalyzes everyone, and they're just little kids.

ramus
23rd February 2019, 18:43
China is putting surveillance cameras in plenty of schools

https://www.abacusnews.com/digital-life/china-putting-surveillance-cameras-plenty-schools/article/3000524

--------------------------------------------------------------------------------------------
Once again pay attention to the way this is worded ...

Where US schools turn to facial recognition for safety, Chinese schools are doing it to manage students .. U.S. IS GOOD CHINA IS BAD ..
----------------------------------------------------------------------------------------------
Being watched by facial recognition cameras when walking around schools? That's not sci-fi anymore.

Responding to deadly shootings, US schools are turning to facial recognition technology to try to prevent them from happening. Lockport City schools have been installing a facial recognition-enabled system that will supposedly detect dangerous people on campus -- and alert the police. Schools in Broward County are adding a surveillance system that can supposedly recognize unusual behavior (though not with facial recognition).

It’s not a surprise that reaction has been strong -- there are deep concerns about how much tracking these systems do, the impact on privacy, what it means for the rights of the students being watched, and the security of the data generated.

Equally, it probably shouldn't be a surprise that surveillance in schools is common in China, where attitudes are very different.

In schools across China, facial recognition cameras are being installed in gates, canteens and even classrooms to watch over students. But it's said to be less about preventing crime and more about helping schools and teachers manage students.

Unsurprisingly, Chinese state media cheers the use of facial recognition as part of “using big data to improve life on campus”. There's a big push for smart campuses across the country, and increasing surveillance -- part of the country's massive SkyNet.
What is Skynet?

Skynet is the Chinese government’s video surveillance system, which it claims is for tracking criminals. Under the project, more than 20 million cameras have been set up in public spaces across the country.

Some schools use them to allow students to check in simply by scanning their face, like the prestigious Peking University and a number of high schools. Others set up facial recognition in canteens to let students pay for their meals that way.

Most of these might seem benign or even useful, but other schools have faced criticism for more invasive uses. One high school in Hangzhou faced a backlash on Chinese social media for using cameras to analyze the faces of students -- to see if they're dozing off in class.
Camera installed in a classroom at Hangzhou No.11 High School. (Picture: Sina)

A number of schools are also buying "intelligent uniforms", equipped with a chip to precisely track students -- and combined with a facial recognition system to ensure that someone hasn't just grabbed a friend's jacket for the day.

The company producing the uniforms says they designed them for the safety of students, and are now exporting them to other Asian countries like India.

But Chinese netizens are skeptical about the uniforms. One Zhihu user says, “If you’re designing the uniforms for the students’ safety, why are you planning to add an ‘anti-dozing-off’ function?”
What is Zhihu?

Zhihu is China’s most popular Q&A platform, similar to Quora. Launched in 2010, it’s widely believed to have a better educated user base than other social media sites. But some say that’s starting to change, as Zhihu seeks...

ramus
26th February 2019, 14:48
As with all articles from the media there is an agenda: FEAR

(Home assistants could soon report their owners to the police for breaking the law based on a “Moral A.I.” system, if the ideas of academics in Europe are implemented.)
But the core of the article is that there is tech that can do this.
--------------------------------------------------------------------------------------------------
Report: Home Assistants with ‘Moral AI’ Could Call Police on Owners


https://www.breitbart.com/tech/2019/02/25/report-home-assistants-with-moral-ai-could-call-police-on-owners/

The Daily Mail reported that home assistants could soon report their owners to the police for breaking the law based on a “Moral A.I.” system, if the ideas of academics in Europe are implemented.

The newspaper reported that academics at the University of Bergen in Norway discussed the idea of a “moral A.I.” for smart home assistants, like the Amazon Echo, Google Home, and Apple HomePod, during a conference.

Moral A.I. would reportedly make home assistants have to “decide whether to report their owners for breaking the law,” or whether to stay silent.

“This would let them weigh up whether to report illegal activity to the police, effectively putting millions of people under constant surveillance,” the Daily Mail explained, adding that Dr. Marija Slavkovik, who led the research, “suggested that digital assistants should possess an ethical awareness that simultaneously represents both the owner and the authorities — or, in the case of a minor, their parents.”

“Devices would then have an internal ‘discussion’ about suspect behaviour, weighing up conflicting demands between the law and personal freedoms, before arriving at the ‘best’ course of action,” the Mail noted.

In an interview with the Mail, Slavkovik declared, “There is [already] an ethical conflict between people in one family, let alone between people and manufacturer, or shareholders of the manufacturer and programmers… If we want to avoid Orwellian outcomes it’s important that all stakeholders are identified and have a say, including when machines shouldn’t be able to listen in. Right now only the manufacturer decides.”

Home assistants, most notably Google Home and Amazon Echo devices, have been at the center of privacy and security concerns since their release.

Amazon Echo devices have been known to scare owners by randomly laughing, and telling one crying woman, “It’s going to be OK,” after she lost her job.

One Amazon Echo device even recorded a family’s conversation before sending it to a random contact, while an “error” granted a German man access to another user’s 1,700 voice recordings.

A report last year also indicated that Amazon Echo devices can be hijacked.

This month, it was revealed that Google failed to disclose a “secret” microphone on its home security product Nest Secure.

The company’s failure to disclose the microphone was only discovered after Google announced that users “would now be able to use Google Assistant” on the security devices.
------------------------------------------------------

CAN ANYONE SAY ORWELLIAN?

ramus
26th February 2019, 15:09
Here is another article on what could happen with the tech at hand ... FEAR, anyone?

Also look at the name of the company that has this tech ... Chinese tech giant Huawei ... has she (Huawei's CFO) been extradited yet?
-----------------------------------------------------------------------------------------------------------
Driverless Snitch? New Car Technology Could Call The Cops On You


https://sacramento.cbslocal.com/2019/02/25/autonomous-snitch-car-technology/

SACRAMENTO (CBS13) — Your car could one day snitch on you if you misbehave, with technology that could call the cops on you.

The company behind the technology says it will keep people safe, but is it an invasion of privacy?

Even if your car drives itself, you can still get busted for drinking, texting, or sleeping because if your car malfunctions, you have to be able to take over.

Police say the driver barreling down the 101 in Redwood City at 70 miles per hour on autopilot in November was drunk and asleep.

It was science fiction nearly three decades ago when California’s future governor was forced to take control of the robot taxi in Total Recall.

Now Chinese tech giant Huawei is working to keep drivers honest by developing technology to detect if you're drunk, tired, distracted, or have a bad case of road rage.

A camera trained on your face looks for clues that you’re impaired and listens for things like slurred speech. It can then lock the car’s controls or call the cops on you.

The CHP says it’s too early to weigh in on the concept, saying they don’t want to speculate about technology that might arrive in the future. But privacy advocates say constant surveillance is dangerous.

“It’s not just were you slurring your speech? It’s how fast are you driving and where were you going and where did you stop,” said Robin Swanson with Californians for Consumer Privacy.

That’s information they’re afraid could be sold.

“Once it leaves our hands that is the bigger question and that is where consumers are demanding the right to know,” Swanson said.

A new state law going into effect next year requires companies to tell consumers what personal information is being collected and who sees it. You will have the right to opt out as well.

TomKat
3rd January 2020, 03:55
I just want to relay a story I heard recently. This happened sometime during the Obama administration. It was related by a retired soldier, special forces. One day he noticed a military drone flying over his yard. He took a picture with his phone. He put the picture on his computer. Later that day, he looked at the picture on his phone and the drone had been pixelated out of the picture. He went to his computer and the picture there was still good. The next day he looked at the picture on his computer and it too was pixelated out.

The moral of this story -- do not think your networked computer is safe from the military. If you take a picture of a UFO or anything the govt might not want you to have, take your computer offline while you download the picture to a USB stick or DVD, and write protect the media. And then print it on a colour laser printer (ink jet prints are destroyed if they get wet).

I also remember, years ago, talking to a county sheriff's deputy whose job it was to catch people downloading child porn. I said something like "what if they just put it on a USB stick?" And he said, "I'll know if they put it on a USB stick," then he shut up, with a facial expression as if he'd said too much.

I have noticed that in some instances Windows theoretically could not connect to Microsoft (the computer did not have an IP address, though it was physically connected by a LAN cable), yet it nevertheless did connect and start downloading updates. I suspect all the OSes, including Linux, have backdoors for use by the govt.

onawah
22nd April 2020, 19:12
Pandemic drones to monitor fever, crowds from above
By Dan Krauth
Wednesday, April 15, 2020
https://abc7ny.com/6102905/

http://projectavalon.net/Pandemic_drones_to_monitor_fever_and_crowds_from_above.mp4

"ELIZABETH, New Jersey (WABC) -- As city and state leaders work to figure out how to reopen daily life safely, some places are looking to technology to help make that happen -- technology that could be hovering above us.

Cities like Elizabeth, New Jersey, are already deploying drones with automated voice messages reminding people to keep their distance. In Meriden, Connecticut, the mayor announced they'll be using them to monitor the city's trails and parks.

But some new drones that are under development now will be equipped not only with cameras, but high tech sensors that can help determine if people are sick or not social distancing down below.

"You''ll be seeing this very soon," said Cameron Chell, CEO of Draganfly, one of the oldest commercial drone companies.

He said they're deploying pilot drone programs across the country this month, including one in the Tri-State Area.

"Where it's most critically needed is where we're going," Chell said.

If the drones follow local and federal regulations, they can be operated just about anywhere as long as they hover within 20 stories of where people are located on the ground to pick up real time data.

"What these cameras can do is actually detect fever, which is very different than detecting just temperature," Chell said. "They can detect sneezing. They can detect your heart rate, your respiratory rate, and they can also detect social distancing. So imagine, if you will, a situation where there's a crowd, and you want to determine what's the infection rate of the crowd and if they are practicing social distancing. Is this a hot spot that is a problem?"

Daniel Schwarz, the privacy and technology strategist with the New York Civil Liberties Union, says there are concerns the technology could be used improperly.

"There can be a place for advanced technology to support health efforts during a crisis like this one, but it should always serve a clear public health purpose," he said in a statement. "Indefinite and unwarranted mass crowd policing does not fit that purpose. Surveillance tools used during the pandemic should be scientifically justified, communicated transparently to the public, limited in their scope and duration, and should always require informed consent. Constant aerial surveillance combined with biased analytics would fundamentally change what it feels like to venture out in public in this country, violate our constitutional rights to freedom of association and privacy, and open the door to expanded broken windows policing of communities uniquely vulnerable to COVID-19."

However, Chell said the technology is not created to identify specific people, only to help keep people safe and flatten the curve.

"As it stands today, it's not designed to identify people with the system," he said. "It's designed to basically provide health monitoring data and be able to give us better data but make more clear decisions."

In a Facebook post, the Elizabeth Police Department said it's using its drones to "save lives" not to be "big brother." "