Tuesday, August 20, 2024

In Screen We Trust - What Happens When the Data Dimension is Reduced
“In screens we trust” has become our motto. It is a paraphrase of the old American motto “In God We Trust”: in short, we believe whatever the screen tells us.

This has several adverse effects. First, impersonation becomes easy on the Internet and in cyberspace. Before we communicate with a person, or with any entity that authenticates itself to us, all we have is visual and auditory information, and because that information reaches us through a machine, its dimensionality is inevitably reduced.



When people say that the center of security is people, or that excessive automation and mechanization is dangerous, what they are pointing at is this drop in the dimensionality of the profile. In AI-infused weaponry, and in security generally, excessive mechanization and automation lower the dimension of the profile: the human intuition that can sniff out something suspicious is lost. We are forced to rely on low-dimensional data for authentication and anomaly detection, which leaves us vulnerable to spoofing.
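The point can be made concrete with a toy sketch. Assume (purely for illustration; all features, numbers, and the threshold are invented) that a contact is described by several features, of which only the first is visible on screen. An impostor who perfectly copies the on-screen feature passes a screen-only check but fails a full-profile check:

```python
# Toy sketch of why low-dimensional authentication is easier to spoof.
# All feature values and the threshold are invented for illustration.
import math

# Feature vector for a known, legitimate contact:
# [on-screen appearance, typing rhythm, active hours, vocabulary score]
legit = [0.95, 0.80, 0.70, 0.85]

# An impostor who perfectly copies what appears on screen (feature 0)
# but cannot reproduce the behavioral features (1..3).
impostor = [0.95, 0.20, 0.10, 0.30]

def distance(a, b, dims):
    """Euclidean distance restricted to the first `dims` features."""
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(dims)))

THRESHOLD = 0.5  # accept as "same person" below this distance

# Screen-only check: uses just the one visible feature.
print(distance(legit, impostor, 1))  # 0.0 -> impostor accepted
# Full-profile check: uses all four features.
print(distance(legit, impostor, 4))  # ~1.01 -> impostor rejected
```

The spoofer only needs to match the dimensions that are actually checked; every dimension we discard is a dimension they no longer have to fake.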



Some examples. Back in 2010, a security researcher invented a fictional woman named “Robin Sage” (borrowing, as it happens, the name of a US Army Special Forces training exercise) and presented her on social networking sites as a 25-year-old cyber-threat analyst working for the US Navy, with a degree from MIT. She received over 300 connection requests from people in key positions in the military and defense industry, reportedly including the Chairman of the Joint Chiefs of Staff and a chief of intelligence management.



Men are especially eager to show off to women, and some of them apparently ended up sending her confidential documents and data about secret bases. In reality there was no “Robin Sage,” but because they saw only a screen, they were fooled by low-dimensional information.



Another well-known case is “Stuxnet,” a piece of malware designed to damage the centrifuges used in Iran's nuclear development facilities. The malware infiltrated the facility's computer systems and disrupted the operation of the centrifuges while showing the operators on-screen readings as if nothing were wrong. In reality the centrifuges were being sabotaged, but because they appeared to be operating normally on the screen, the operators were slow to notice the anomaly. This, too, was a result of over-reliance on what the screen showed.
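The trick can be reduced to a few lines. In this sketch (the values and the 30%-per-step sabotage rate are invented, not Stuxnet's actual behavior), a display layer replays the nominal reading while the physical process diverges:

```python
# Toy illustration (hypothetical values) of a display layer that replays
# "normal" readings while the physical process actually degrades.
normal_rpm = 1064  # nominal centrifuge speed (illustrative figure)

actual_rpm = []  # what the hardware really does
shown_rpm = []   # what the operator's screen reports

rpm = normal_rpm
for step in range(5):
    rpm = int(rpm * 1.3)          # sabotage: spin ~30% faster each step
    actual_rpm.append(rpm)
    shown_rpm.append(normal_rpm)  # the screen replays the nominal value

print(shown_rpm)   # flat: [1064, 1064, 1064, 1064, 1064]
print(actual_rpm)  # rising well past the nominal speed
```

An operator whose only sensor is the screen sees the first list; the second list is invisible to them by construction.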



Another example is the “flash crash” of May 6, 2010, when prices on the US stock market suddenly plummeted and then quickly recovered. The crash was driven by high-frequency trading (HFT), a type of algorithmic trading in which stocks are bought and sold at very high speed, with large volumes of orders executed in response to slight price fluctuations. On that day, when a large sell order hit the market, the algorithms reacted to it en masse and accelerated the selling, throwing the whole market into panic. The machines themselves could not detect that anything fatal was going wrong.
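The feedback loop behind such a cascade can be sketched in a few lines. All parameters here are invented (the starting price, the initial sell order, and the rule that algorithms sell twice as hard as the last drop); the point is only the shape of the curve:

```python
# A toy positive-feedback loop in the spirit of an algorithmic selling
# cascade: each price drop triggers more selling, which drops the price
# further. All parameters are invented for illustration.
price = 100.0
prev = price
history = [price]
sell_pressure = 1.0  # the initial external sell order (arbitrary units)

for tick in range(6):
    price -= sell_pressure      # selling pushes the price down
    drop = prev - price         # how far the price just fell
    sell_pressure = 2.0 * drop  # algorithms react by selling twice as hard
    prev = price
    history.append(round(price, 2))

print(history)  # [100.0, 99.0, 97.0, 93.0, 85.0, 69.0, 37.0]
```

Each tick's drop doubles the next tick's selling, so the decline accelerates far faster than any human watching the screen could react to it.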



As automation increases, spoofing and visual deception become easier to fall for, because only low-dimensional visual information is available for authentication and detection. When humans are no longer central, detection based on high-dimensional, fine-grained data is no longer possible.
