
As if listening to you have sex wasn’t bad enough, Siri and Alexa can also be hijacked by LASERS, researchers find


Voice-activated digital assistants can be remotely hijacked by lasers as far as 360 feet away and made to order products, start cars, and otherwise drive a smart-home owner crazy, researchers have discovered.

Google Home, Amazon’s Alexa, and Apple’s Siri can be remotely hijacked from hundreds of feet away with lasers pointed at their microphones, researchers at the University of Michigan and the University of Electro-Communications in Tokyo found. The takeover is instantaneous and silent – a well-placed command to turn the device’s volume down to zero would ensure that even its spoken responses go unnoticed by its hapless owner.

Video: https://www.youtube.com/embed/iK2PtdQs77c

Researchers were able to open garage doors, crack “smart” locks, make online purchases, and even unlock and start vehicles using carefully aimed lasers. Any system connected to the device can be controlled through this relatively simple mode of attack. Because the microphones on voice assistants work by converting sound into electrical signals, encoding the same electrical signal into a laser beam produces an equivalent response to a particular voice command.
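To make that mechanism concrete, here is a minimal Python sketch of the amplitude-modulation idea the attack relies on: the audio waveform of a spoken command is mapped onto the drive level of a laser, so the microphone ends up producing roughly the same electrical signal it would for real sound. The function name, bias, and modulation depth below are illustrative assumptions, not values from the research.

```python
import numpy as np

def audio_to_laser_drive(audio: np.ndarray, bias: float = 0.5,
                         depth: float = 0.4) -> np.ndarray:
    """Map an audio waveform in [-1, 1] onto a normalized laser
    intensity in [0, 1]: drive(t) = bias + depth * audio(t).
    This is plain amplitude modulation of the light's brightness.
    (bias/depth are hypothetical parameters for illustration.)"""
    audio = audio / max(1e-9, float(np.max(np.abs(audio))))  # normalize
    return np.clip(bias + depth * audio, 0.0, 1.0)

# Example: encode a 1 kHz test tone sampled at 48 kHz
sr = 48_000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 1_000 * t)
drive = audio_to_laser_drive(tone)  # a signal a laser driver could follow
```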

Video: https://www.youtube.com/embed/EtzP-mCwNAs

Using a telephoto lens to focus the laser, they were able to shanghai devices in other buildings – a Google Home coughed up the time from about 250 feet (75 meters) away with a laser projected diagonally downward at a 21-degree angle. Even with a low-power five-milliwatt laser, a Google Home and early-model Amazon Echo could be ordered around from nearly 360 feet (110 meters) away. The researchers emphasized the accessibility of the setup – any determined device-hacker could throw something together for a few hundred dollars with commercially available parts.

As the high-tech “smart home” is increasingly controlled by voice commands issued through devices like Google Home or Alexa, it becomes enormously susceptible to outside attacks – to say nothing of the surveillance possibilities. The researchers did the ethical thing and warned device manufacturers including Amazon, Apple, Google – even Tesla and Ford, whose cars could be remotely controlled – of the vulnerability, but it’s not the first flaw in these supposedly smart assistants, and it surely won’t be the last.


In 2017, Chinese researchers found that every voice assistant they tested could be hijacked with high-frequency commands pitched at frequencies over 20,000 Hz that are inaudible to humans. It’s not clear if that vulnerability was ever fixed – filtering out those frequencies may not be possible with current microphone design – but unlike the laser hack, the ultrasonic hack was only possible in close proximity to the device. Another attack uses commands camouflaged in other sounds that are incomprehensible to humans but easily understood by devices.
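For contrast, a hedged sketch of the ultrasonic approach: in the published research, the audible command is amplitude-modulated onto a carrier above 20,000 Hz, and nonlinearity in the microphone’s front end demodulates the envelope back into the audible band. The carrier frequency, modulation depth, and sample rate below are illustrative assumptions.

```python
import numpy as np

def ultrasonic_am(audio: np.ndarray, sr: int,
                  carrier_hz: float = 25_000.0,
                  depth: float = 0.8) -> np.ndarray:
    """Classic AM: s(t) = (1 + depth * m(t)) * cos(2*pi*f_c*t).
    The carrier sits above human hearing; a microphone's nonlinear
    response recovers the envelope m(t) in the audible band.
    (carrier_hz and depth are hypothetical example values.)"""
    audio = audio / max(1e-9, float(np.max(np.abs(audio))))  # normalize
    t = np.arange(len(audio)) / sr
    return (1.0 + depth * audio) * np.cos(2 * np.pi * carrier_hz * t)

# The sample rate must exceed twice the carrier (Nyquist), e.g.:
sr = 96_000
t = np.arange(sr) / sr
command = np.sin(2 * np.pi * 440 * t)   # stand-in for a voice clip
signal = ultrasonic_am(command, sr)     # would be played via an ultrasonic speaker
```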

Meanwhile, even when the devices are working perfectly without interference by sound- and light-hackers, they are still piping owners’ intimate moments to teams of humans, whose official purpose is to evaluate the performance of the artificial intelligence powering the smart speakers. Those teams have infamously been caught swapping entertaining recordings among themselves. Amazon began letting users opt out of human review of Alexa recordings earlier this year, though Apple recently reinstated human review of Siri recordings as the default setting and Google only paused human review in some European countries.


Source: RT
