Digital Assassination: AI-Enabled Targeting Of Hamas Commanders

Originally published by IslamicWorldNews

Following the October 7 attack, the Israeli regime began deploying artificial intelligence extensively in its military operations, including facial-recognition systems to identify wounded or partially obscured combatants, audio analysis of intercepted communications to geolocate targets, and Arabic-language natural-language processing to monitor messages and social-media traffic. These technologies have been employed in the targeted assassination of Hamas commanders and in the recovery of hostages, and have been incrementally refined over time.

According to four Israeli officials, after the October 7 attack on the occupied territories, which resulted in the deaths of over 1,200 Israelis and the taking of 250 hostages, artificial intelligence technologies were rapidly approved for field use. This led to cooperation between Unit 8200 and reserve forces in the “Studio” to swiftly develop new AI capabilities.

Over the past 18 months, the occupying regime has combined artificial intelligence with facial recognition software to match partially covered or injured faces with real identities. It has also used AI to compile lists of potential airstrike targets and developed an Arabic-language AI model to power a chatbot capable of scanning and analyzing text messages, social media posts, and other Arabic-language data.

In late 2023, the Zionist regime was attempting to assassinate Ibrahim Biari, the commander of Hamas’s Jabalia Central Battalion, who was allegedly involved in planning the October 7 attack. The regime’s military intelligence intercepted Biari’s communications with other Hamas members but was unable to pinpoint his exact location. It therefore turned to an AI-powered audio tool that analyzed ambient sounds in the intercepted calls, such as sonic booms and airstrikes, to estimate where they were coming from.

After the approximate location of Biari’s communications had been identified, Zionist military officials were warned that the area, which included several apartment complexes, was densely populated. Officers said that ensuring Biari’s death would require targeting multiple buildings in the airstrike. The operation was approved and carried out.

Since then, the regime’s military intelligence has also used this audio tool—alongside maps and images of Gaza’s complex underground tunnels—to search for hostages. According to two Israeli officers, the tool has improved over time to identify individuals with greater accuracy.
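
The report does not describe how this audio tool works internally. Purely as a rough illustration of the kind of acoustic analysis it alludes to, the sketch below uses the open-source librosa and scikit-learn libraries to turn short audio clips into spectrogram features and label acoustic events such as explosions; the labels, features, and classifier are placeholder assumptions, not a reconstruction of any military system.

```python
# Illustrative sketch only: generic acoustic event classification with
# open-source tools. The labels, features, and model are placeholders.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier


def audio_features(path: str, sr: int = 16_000) -> np.ndarray:
    """Summarize an audio clip as per-band mel-spectrogram statistics."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64)
    mel_db = librosa.power_to_db(mel, ref=np.max)
    # Mean and standard deviation per mel band -> fixed-length feature vector.
    return np.concatenate([mel_db.mean(axis=1), mel_db.std(axis=1)])


def train_event_classifier(paths, labels):
    """Fit a simple classifier on labeled clips, e.g. 'explosion' vs 'background'."""
    X = np.stack([audio_features(p) for p in paths])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf


def label_clip(clf, path):
    """Return the most likely event label and the class probabilities for one clip."""
    probs = clf.predict_proba(audio_features(path).reshape(1, -1))[0]
    return clf.classes_[int(np.argmax(probs))], dict(zip(clf.classes_, probs))
```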

Tracking the Location of Individuals

According to three informed Israeli and American officials (whose names were not disclosed in the report), Israeli officers turned to a new military technology integrated with artificial intelligence to track their targets. This technology had been developed about a decade ago but had not yet been used on the battlefield. The effort to locate Martyr Ibrahim Biari provided fresh motivation to enhance the tool. Engineers from Unit 8200—the occupying regime’s equivalent of the U.S. National Security Agency—rapidly integrated AI into the system.

Shortly afterward, the regime’s security personnel intercepted Mr. Biari’s phone calls and tested the AI-based audio tool, which identified the approximate location of his communications. Using this information, the Zionist regime ordered an airstrike on the area on October 31, 2023, resulting in the martyrdom of Ibrahim Biari.

According to a report by Airwars, a London-based conflict monitoring organization, due to the approximate nature of the AI-generated location, more than 125 civilians were also killed in the strike! Based on interviews with nine American and Israeli defense officials, the audio tool is just one example that demonstrates how the regime has used the Gaza war to test and rapidly deploy AI-supported military technologies on an unprecedented scale.

Behind-the-Scenes Companies and Reactions to the Committed Atrocities

Many of these efforts resulted from collaboration between active-duty soldiers in Unit 8200 and reservists working at technology companies such as Google, Microsoft, and Meta. According to these officials, Unit 8200 has established a central hub called the “Studio,” an innovation center that connects experts with AI projects and draws on reservists from those companies. The Studio’s primary mission is to rapidly adapt artificial intelligence technologies to military needs.

While the regime has continued to rapidly expand its artificial intelligence arsenal, the deployment of these technologies has led to mistaken identifications, wrongful arrests, and civilian deaths. According to European and American defense officials, no other power has been as active as the occupying regime in testing AI tools in real combat. This offers an early glimpse of how such technologies might be used in future wars, and a warning of how they can malfunction or be misused, resulting in the killing of civilians.

Meta and Microsoft declined to comment on the killings carried out in the Gaza war using artificial intelligence, but Google, in an effort to clear itself of accusations, stated: “We have employees who serve as reservists for the regime in various countries around the world. The work these employees do as reservists is not related to Google.”

Avi Hasson, CEO of the nonprofit organization Startup Nation Central, which connects investors to companies in the occupied territories, said, “Reservists from Meta, Google, and Microsoft have played a vital role in advancing innovation in drones and data integration.” He added, “Reservists brought specialized knowledge and access to key technologies that were not available in the military.”

Artificial Intelligence in the Drone Manufacturing Industry

The Israeli regime’s military soon began using artificial intelligence to upgrade its drone fleet. Aviv Shapira, founder and CEO of XTEND—a software and drone company collaborating with the regime’s military—said that AI-based algorithms have been used to develop drones capable of locking onto and tracking targets from a distance. “In the past, targeting relied on focusing on the image of the target. But now, AI can identify and track the object itself—whether it’s a moving vehicle or a person—with very high accuracy,” he said. Shapira added that his main clients—the occupying regime’s military and the U.S. Department of Defense—are aware of the ethical implications of using AI in warfare and are engaging in discussions about the responsible use of this technology.
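
Shapira does not say which algorithms XTEND uses. As a simplified, non-authoritative sketch of how a tracker can stay locked on an object across video frames, the snippet below assumes an external detector supplies bounding boxes each frame and keeps following the box that best overlaps the previous one; the Box class, the IoU threshold, and the missing detector are all assumptions made for illustration.

```python
# Minimal sketch of "lock-on" style tracking: associate the selected object's
# bounding box with detections in each new frame by highest overlap (IoU).
# The detector is treated as a black box; this illustrates the general
# technique, not any vendor's implementation.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Box:
    x1: float
    y1: float
    x2: float
    y2: float  # box corners in pixels


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
    ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a.x2 - a.x1) * (a.y2 - a.y1)
    area_b = (b.x2 - b.x1) * (b.y2 - b.y1)
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0


class SingleTargetTracker:
    """Follow one selected object across successive frames of detections."""

    def __init__(self, initial: Box, min_iou: float = 0.3):
        self.box = initial          # last known position of the tracked object
        self.min_iou = min_iou      # minimum overlap to accept a new detection

    def update(self, detections: List[Box]) -> Optional[Box]:
        """Pick the detection that best overlaps the last known box, if any."""
        if not detections:
            return None  # target lost this frame
        best = max(detections, key=lambda d: iou(self.box, d))
        if iou(self.box, best) < self.min_iou:
            return None  # no sufficiently similar detection; declare lost
        self.box = best
        return best
```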

The Arabic Language: A Battleground for the Enemy to Influence Public Opinion

According to three Israeli officers, one of the tools developed by the “Studio” was an Arabic-language artificial intelligence model, classified as a large language model. (This LLM was previously reported by the news site +972.) Developers had previously faced difficulties in creating such a model due to the lack of sufficient Arabic-language data to train the technology. Even when such data was available, it was mostly in standard written Arabic, which is far more formal than the dozens of different dialects used in spoken Arabic.

According to Israeli officers, the occupying regime’s military did not face this problem. The illegitimate government had access to decades’ worth of intercepted text messages, recorded phone conversations, and social media posts collected in colloquial Arabic. As a result, in the first few months of the Gaza war, Zionist elements developed the large language model and designed a chatbot capable of handling queries in Arabic. They integrated this tool with multimedia databases, enabling Zionist analysts to conduct complex searches across images and videos.
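
The report does not explain how the chatbot was connected to multimedia databases. One common open-source way to search images with text, offered here only as a hedged sketch and not as the regime’s actual method, is to embed images and text queries into a shared vector space with a CLIP-style model and rank images by similarity; the checkpoint name and file paths are placeholders.

```python
# Illustrative sketch of cross-modal retrieval with an open-source CLIP model
# from the sentence-transformers library: embed images and a text query into
# the same vector space and rank images by cosine similarity. This is a
# generic technique, not a description of the system in the article.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# Public research checkpoint, used here purely as a placeholder.
model = SentenceTransformer("clip-ViT-B-32")


def rank_images(query_text, image_paths, top_k=5):
    """Return the top_k image paths most similar to the text query."""
    image_embeddings = model.encode(
        [Image.open(p).convert("RGB") for p in image_paths],
        convert_to_tensor=True,
    )
    query_embedding = model.encode(query_text, convert_to_tensor=True)
    scores = util.cos_sim(query_embedding, image_embeddings)[0]
    ranked = sorted(zip(image_paths, scores.tolist()), key=lambda x: -x[1])
    return ranked[:top_k]


# Hypothetical usage with placeholder files:
# print(rank_images("a pickup truck at night", ["a.jpg", "b.jpg", "c.jpg"]))
```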

When the Zionist regime assassinated Sayyed Hassan Nasrallah in September 2024, the chatbot analyzed reactions to his martyrdom across the Arabic-speaking world. The technology distinguished between different dialects within Lebanon to assess public sentiment and helped the regime judge whether public opinion was pressing for a retaliatory strike.
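
No details are given about how this sentiment assessment was carried out. As a rough sketch of the general approach, assuming posts arrive already tagged with a region or dialect label, the snippet below scores Arabic text with a publicly available sentiment model and tallies the results per region; the model checkpoint and the toy data are assumptions, not the tool described here.

```python
# Illustrative sketch: scoring Arabic posts for sentiment and aggregating by
# pre-tagged region. The checkpoint is an open research model used as a
# placeholder, not the system described in the article.
from collections import defaultdict
from transformers import pipeline

# Assumption: this open CAMeL-Lab checkpoint stands in for whatever model
# might be used; any comparable Arabic sentiment model could be substituted.
sentiment = pipeline(
    "sentiment-analysis",
    model="CAMeL-Lab/bert-base-arabic-camelbert-da-sentiment",
)


def sentiment_by_region(posts):
    """posts: iterable of (region, text) pairs; region labels are supplied upstream."""
    totals = defaultdict(lambda: defaultdict(int))
    for region, text in posts:
        label = sentiment(text[:512])[0]["label"]  # crude truncation of long posts
        totals[region][label] += 1
    return {region: dict(counts) for region, counts in totals.items()}


# Toy example (Arabic for "excellent" and "very bad"):
print(sentiment_by_region([("beirut", "ممتاز"), ("tripoli", "سيء جدا")]))
```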

It is also reported that, at times, the chatbot was unable to recognize certain modern slang terms and words that had been phonetically transliterated from English into Arabic. One of the officers stated that this required Zionist intelligence officers specialized in various dialects to review and correct the results.

The chatbot also occasionally produced incorrect results, for example returning a picture of a pipe instead of an image of a weapon. Nevertheless, the officers said the AI tool significantly accelerated research and analysis.

Facial Recognition and Identification of Individuals

At temporary checkpoints set up between northern and southern Gaza following the October 7 attacks, the regime also deployed cameras capable of capturing high-resolution images of Palestinians and sending them to an AI-based facial recognition program. According to two intelligence officers from the regime, the system sometimes struggled to identify individuals whose faces were covered. This led to the detention and interrogation of Palestinians who had been mistakenly identified by the facial recognition system.
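
The report does not identify the software behind these checkpoint cameras. As a generic illustration of embedding-based face matching, and of why occluded or low-quality images degrade it, here is a minimal sketch built on the open-source face_recognition library; the watchlist, file paths, and distance threshold are placeholder assumptions rather than anything known about the deployed system.

```python
# Generic face-matching sketch with the open-source face_recognition library.
# It illustrates the class of technique only; the 0.6 tolerance is the
# library's default, not a known operational setting.
import face_recognition


def best_match(known_people, probe_image_path, tolerance=0.6):
    """known_people: list of (name, image_path); returns (name, distance) or None."""
    probe = face_recognition.load_image_file(probe_image_path)
    probe_encodings = face_recognition.face_encodings(probe)
    if not probe_encodings:
        return None  # no face found, e.g. heavily occluded or low-quality image
    probe_encoding = probe_encodings[0]

    names, encodings = [], []
    for name, path in known_people:
        image = face_recognition.load_image_file(path)
        faces = face_recognition.face_encodings(image)
        if faces:
            names.append(name)
            encodings.append(faces[0])
    if not encodings:
        return None  # nothing usable to compare against

    distances = face_recognition.face_distance(encodings, probe_encoding)
    best_idx = int(distances.argmin())
    if distances[best_idx] > tolerance:
        return None  # nearest identity is still too far away to trust
    return names[best_idx], float(distances[best_idx])
```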

The regime also used artificial intelligence to process and refine data collected by intelligence officials about Hamas members. Prior to the war, the regime had developed a machine learning algorithm called “Lavender,” designed to rapidly sort through data and assist in identifying low-level militants. This algorithm was trained on a database of confirmed Hamas members and aimed to predict who else might be affiliated with the group.

According to reports, “Lavender” helped the occupying army compile a list of 37,000 human targets based on their association with Hamas. Although the system’s predictions were not flawless, the regime used it at the beginning of the Gaza war to assist in selecting targets for attack.

