
The Israeli military (IDF) has long earned a reputation for technical prowess and has previously made bold but unverified claims about the use of new technologies. Following the 11-day war in Gaza in May 2021, officials said Israel had waged its “first AI war” using machine learning and advanced computing, The Guardian reported.
Israel’s latest war with Hamas has given the IDF an unprecedented opportunity to use such tools in a much wider theater of operations, including the deployment of an AI target-creation platform called Habsora, or “the Gospel,” which has significantly accelerated the production of lethal targets and which officials have likened to a “factory.”
The Guardian has revealed new details about the Gospel and its central role in Israel’s war in Gaza, drawing on interviews with intelligence sources and corroborating lesser-known statements by retired and serving military officials.
The Israeli army has not hidden the intensity of its bombardment of the Gaza Strip. In the first days of the offensive, the head of its air force spoke of continuous airstrikes, “24 hours a day.” Its forces were striking only military targets, he said, but added: “We are not surgeons.”
However, relatively little attention has been paid to the methods used by the Israel Defense Forces (IDF) to select targets in Gaza, and to the role of artificial intelligence in their bombing campaign. Now, as Israel resumes its offensive after a seven-day ceasefire, there is growing concern about the Israeli military’s approach to targeting in the war against Hamas, which, according to the Hamas-run Gaza Health Ministry, has so far killed more than 15,000 people.
A secret military intelligence unit
In fact, a secret military intelligence unit assisted by artificial intelligence is playing a significant role in Israel’s response to the massacre by Hamas in southern Israel on October 7.
The picture of how the Israeli military is using artificial intelligence is gradually emerging amid growing concern about the risks it poses to civilians, as advanced militaries around the world expand the use of complex and opaque automated systems on the battlefield. “Other countries will watch and learn,” said a former White House security official familiar with the US military’s use of autonomous systems.
In early November, the Israel Defense Forces said its target management unit had identified “more than 12,000” targets in Gaza. Describing the unit’s targeting process, one official said, “We work relentlessly to determine who and what the enemy is. Hamas fighters are not immune — no matter where they hide.”
The activities of the unit, which was formed in 2019 as part of the IDF’s intelligence directorate, are classified. However, a brief statement on the IDF website said it uses an artificial-intelligence-based system called Habsora (“the Gospel”) in the war against Hamas for the rapid generation of targets.
The IDF said that “through rapid and automatic extraction of intelligence,” the Gospel produces targeting recommendations for its researchers “with the goal of a complete match between the machine’s recommendation and the identification carried out by a person.” In essence, the AI system is used to automatically recommend targets for attack, such as the private homes of individuals suspected of being Hamas or Islamic Jihad operatives.
100 targets a day
In recent years, the task force has helped the Israel Defense Forces build a database of what sources say are between 30,000 and 40,000 suspected militants. They say systems like Habsora have played a key role in compiling hit lists.
Aviv Kochavi, who served as Israel’s army chief until January, said the task force “works on the basis of artificial intelligence capabilities” and includes hundreds of officers and soldiers. In an interview published before the war, he said it was “a machine that produces vast amounts of data more efficiently than any human and turns them into targets for attack.”
According to Kochavi, “once this machine was activated” during Israel’s 11-day war with Hamas in May 2021, it generated 100 targets per day. “To put this in perspective, in the past we produced 50 targets in Gaza a year. Now this machine produces 100 targets a day, 50% of them are attacked,” the former official explained.
It is not known exactly what forms of data feed the artificial intelligence system. But experts say AI-based decision-support systems for targeting typically analyze large sets of information from a range of sources, such as drone footage, intercepted communications, surveillance data, and information gleaned from tracking the movements and behavior patterns of individuals and large groups.
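Nothing public describes how such a system actually works internally. Purely as a generic illustration of the decision-support pattern experts describe, the sketch below weights per-source confidence signals into a single score; the source names, weights, and scoring rule are all invented and reflect nothing about the real system.

```python
# Hypothetical illustration of multi-source signal fusion for a
# decision-support score. All names and weights are invented.

SIGNAL_WEIGHTS = {
    "drone_footage": 0.3,
    "intercepted_comms": 0.3,
    "surveillance_data": 0.2,
    "movement_patterns": 0.2,
}

def fuse_signals(signals: dict) -> float:
    """Combine per-source confidence values (0..1) into one weighted score."""
    return sum(SIGNAL_WEIGHTS[name] * value
               for name, value in signals.items()
               if name in SIGNAL_WEIGHTS)

score = fuse_signals({"drone_footage": 0.8, "intercepted_comms": 0.5})
# 0.3*0.8 + 0.3*0.5 = 0.39
```

In a real system of this kind, such a score would be one input to a human review step, not a decision in itself.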
Why was the target unit created?
The Targets Division was created to address a chronic IDF problem: during previous operations in the Gaza Strip, the air force repeatedly ran short of targets to hit. As Hamas leaders disappeared into tunnels at the start of any new offensive, sources said, systems like the Gospel allowed the IDF to locate and strike a much larger pool of junior operatives.
One official, who worked on targeting decisions during previous operations in Gaza, said the IDF had not previously bombed the homes of lower-ranking Hamas members. That has changed in the current conflict, with the homes of suspected Hamas fighters now being targeted regardless of rank.
Pre-strike estimates of likely civilian casualties
In a brief statement by the IDF about its target unit, a senior official said the unit “delivers precision attacks on Hamas-linked infrastructure while inflicting heavy damage on the enemy and minimal damage to non-combatants.”
The accuracy of the strikes recommended by the “artificial intelligence target bank” has been highlighted in several reports in the Israeli press. The daily Yedioth Ahronoth reported that the unit “ensures as far as possible that uninvolved civilians are not harmed.”
One source, a former senior Israeli military official, told The Guardian that operatives use a “very accurate” measurement of the rate at which civilians evacuate a building shortly before a strike. “We use an algorithm to evaluate how many civilians remain. It gives us green, yellow, red, like a traffic light,” the source explained.
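The source describes this only in outline: an estimate of remaining occupants, mapped to a traffic-light rating. The sketch below models just that described behavior; the function names, thresholds, and the linear evacuation model are all hypothetical, since the actual algorithm is not public.

```python
# Illustrative sketch only -- the real algorithm is not public.
# Models the described behavior: estimate civilians remaining in a
# building, then map the estimate to a green/yellow/red rating.
# Thresholds and the linear evacuation model are invented.

def estimate_remaining(initial_occupants: int,
                       evacuation_rate_per_min: float,
                       minutes_elapsed: int) -> int:
    """Estimate occupants still inside, given an observed evacuation rate."""
    evacuated = evacuation_rate_per_min * minutes_elapsed
    return max(0, round(initial_occupants - evacuated))

def traffic_light(remaining: int, yellow_max: int = 5) -> str:
    """Map an occupancy estimate to the traffic-light rating the source described."""
    if remaining == 0:
        return "green"
    if remaining <= yellow_max:
        return "yellow"
    return "red"

print(traffic_light(estimate_remaining(30, 2.0, 15)))  # 30 - 2.0*15 = 0 -> "green"
```

The point of the sketch is only to make the source’s description concrete: the output is an estimate driven by modeling assumptions, not an observation, which is central to the experts’ skepticism quoted below.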
According to figures released by the IDF in November, Israel attacked 15,000 targets in Gaza in the first 35 days of the war, far exceeding the number of previous military operations in the densely populated coastal territory. By comparison, in the 2014 war, which lasted 51 days, the IDF hit between 5,000 and 6,000 targets.
Multiple sources said that when a strike was authorized on the private homes of individuals identified as Hamas or Islamic Jihad militants, target scouts knew in advance the number of civilians expected to be killed. The sources said each target had a collateral damage assessment file that indicated how many civilians could be killed in a strike.
A source who worked on strike planning for the IDF until 2021 said that “the decision to strike is made by the commander of the military unit.”
Experts are skeptical
However, artificial intelligence and armed conflict experts who spoke to The Guardian said they were skeptical of claims that AI-based systems would reduce harm to civilians by encouraging more precise targeting. A lawyer who advises governments on artificial intelligence and compliance with humanitarian law said there was “little empirical evidence” to support such claims. Others emphasized the visible impact of the bombings.
“Look at the physical landscape of Gaza,” said Richard Moyes, a researcher who runs Article 36, a group that campaigns to reduce the harm caused by weapons. “We are witnessing the large-scale flattening of an urban area with heavy explosive weapons, so the claim that this is a precise and narrow use of force is not borne out by the facts.”
Sources familiar with how AI-based systems have been integrated into IDF operations said such tools have greatly accelerated the target generation process.
“We prepare targets automatically and work according to a checklist,” a source who previously worked in the targets division told Israel’s +972/Local Call. “It really feels like a factory. We work quickly and do not have time to study the target in detail. The feeling is that we are judged by how many targets we manage to create,” the source explained.
Another source told the publication that the Gospel allowed the IDF to run a “mass assassination factory” in which “the emphasis is on quantity, not quality.” According to that source, a human eye “scans the targets before each attack, but it doesn’t need to spend a lot of time on them.”
Moyes said that by relying on tools like the Gospel, a commander “is handed a computer-generated list of targets” and “doesn’t necessarily know how the list was created or have the ability to adequately interrogate and question the targeting recommendations.”
“There is a danger,” he added, “that when people begin to rely on these systems, they become cogs in a mechanized process and lose the ability to consider the risk of civilian harm in a meaningful way.”
Source: News.ro
Source: Hot News

Ashley Bailey is a journalist at 247 News Reel.