Some say AI will make war more humane. Israel’s war in Gaza shows the opposite.

Israel has reportedly been using AI to guide its war in Gaza — and treating its decisions almost as gospel. In fact, one of the AI systems being used is literally called “The Gospel.”

According to a major investigation published last month by the Israeli outlet +972 Magazine, Israel has been relying on AI to decide whom to target for killing, with humans playing an alarmingly small role in the decision-making, especially in the early stages of the war. The investigation, which builds on a previous exposé by the same outlet, describes three AI systems working in concert.

“Gospel” marks buildings that it says Hamas militants are using. “Lavender,” which is trained on data about known militants, then trawls through surveillance data about almost everyone in Gaza — from photos to phone contacts — to rate each person’s likelihood of being a militant. It puts those who get a higher rating on a kill list. And “Where’s Daddy?” tracks these targets and tells the army when they’re in their family homes, an Israeli intelligence officer told +972, because it’s easier to bomb them there than in a protected military building.

The result? According to the Israeli intelligence officers interviewed by +972, some 37,000 Palestinians were marked for assassination, and thousands of women and children have been killed as collateral damage because of AI-generated decisions. As +972 wrote, “Lavender has played a central role in the unprecedented bombing of Palestinians,” which began soon after Hamas’s deadly attacks on Israeli civilians on October 7.

The use of AI could partly explain the high death toll in the war — at least 34,735 killed to date — which has sparked international criticism of Israel and even charges of genocide before the International Court of Justice.

Although there is still a “human in the loop” — tech-speak for a person who affirms or contradicts the AI’s recommendation — Israeli soldiers told +972 that they essentially treated the AI’s output “as if it were a human decision,” sometimes only devoting “20 seconds” to looking over a target before bombing, and that the army leadership encouraged them to automatically approve Lavender’s kill lists a couple weeks into the war. This was “despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases,” according to +972.

The Israeli army denied that it uses AI to select human targets, saying instead that it has a “database whose purpose is to cross-reference intelligence sources.” But UN Secretary-General António Guterres said he was “deeply troubled” by the reporting, and White House national security spokesperson John Kirby said the US was looking into it.

How should the rest of us think about AI’s role in Gaza?

While AI proponents often say that technology is neutral (“it’s just a tool”) or even argue that AI will make warfare more humane (“it’ll help us be more precise”), Israel’s reported use of military AI arguably shows just the opposite.

“Very often these weapons are not used in such a precise manner,” Elke Schwarz, a political theorist at Queen Mary University of London who studies the ethics of military AI, told me. “The incentives are to use the systems at large scale and in ways that expand violence rather than contract it.”

Schwarz argues that our technology actually shapes the way we think and what we come to value. We think we’re running our tech, but to some degree, it’s running us. Last week, I spoke to her about how military AI systems can lead to moral complacency, prompt users toward action over non-action, and nudge people to prioritize speed over deliberative ethical reasoning. A transcript of our conversation, edited for length and clarity, follows.

Sigal Samuel

Were you surprised to learn that Israel has reportedly been using AI systems to help direct its war in Gaza?

Elke Schwarz

No, not at all. There have been reports for years saying that it’s very likely that Israel has AI-enabled weapons of various kinds. And they’ve made it quite clear that they’re developing these capabilities and consider themselves one of the most advanced digital military forces globally, so there’s no secret around this pursuit.

Systems like Lavender or even Gospel are not surprising because if you just look at the US’s Project Maven [the Defense Department’s flagship AI project], that started off as a video analysis algorithm and now it’s become a target recommendation system. So, we’ve always thought it was going to go in that direction and indeed it did.

Sigal Samuel

One thing that struck me was just how uninvolved the human decision-makers seem to be. An Israeli military source said he would devote only about “20 seconds” to each target before authorizing a bombing. Did that surprise you?

Elke Schwarz

No, that didn’t either. Because the conversation in militaries over the last five years has been about accelerating the “kill chain” — using AI to increase lethality. The phrase that’s always used is “to shorten the sensor-to-shooter timeline,” which basically means making it really fast from the input to when some weapon gets fired.

The allure of these AI systems is that they operate so fast, and at such vast scale, suggesting many, many targets within a short period of time, so that the human just kind of becomes an automaton who presses the button and is like, “Okay, I guess that looks right.”

Defense publications have always said Project Convergence, another US [military] program, is really designed to shorten that sensor-to-shooter timeline from minutes to seconds. So having 20 seconds fits quite clearly into what has been reported for years.

Sigal Samuel

For me, this brings up questions about technological determinism, the idea that our technology determines how we think and what we value. As the military scholar Christopher Coker once said, “We must choose our tools carefully, not because they are inhumane (all weapons are) but because the more we come to rely on them, the more they shape our view of the world.”

You wrote something reminiscent of that in a 2021 paper: “When AI and human reasoning form an ecosystem, the possibility for human control is limited.” What did you mean by that? How does AI curtail human agency or reshape us as moral agents?

Elke Schwarz

In a number of ways. One is about the cognitive load. With all the data that is being processed, you kind of have to place your trust in the machine’s decision. First, because we don’t know what data is gathered and exactly how it then applies to the model. But also, there’s a cognitive disparity between the way the human brain processes things and the way an AI system makes a calculation. This leads to what we call “automation bias,” which is basically that as humans we tend to defer to the machines’ authority, because we assume that they’re better, faster, and cognitively more powerful than us.

Another thing is situational awareness. What is the data that is incoming? What is the algorithm? Is there a bias in it? These are all questions that an operator or any human in the loop should have knowledge about but mostly don’t have knowledge about, which then limits their own situational awareness about the context over which they should have oversight. If everything you know is presented to you on a screen of data and points and graphics, then you take that for granted, but your own sense of what the situation is on the battlefield becomes very limited.

And then there’s the element of speed. AI systems are simply so fast that we don’t have enough [mental] resources to not take what they’re suggesting as a call to action. We don’t have the wherewithal to intervene on the grounds of human reasoning. It’s like how your phone is designed in a way that makes you feel like you need to react — like, when a red dot pops up in your email, your first instinct is to click on it, not to not click on it! So there’s a tendency to prompt users toward action over non-action. And the fact is that if a binary choice is presented, kill or not kill, and you’re in a situation of urgency, you’re probably more likely to act and release the weapon.

Sigal Samuel

How does this relate to what the philosopher Shannon Vallor calls “moral de-skilling” — her term for when technology negatively affects our moral cultivation?

Elke Schwarz

There’s an inherent tension between moral deliberation, or thinking about the consequences of our actions, and the mandate of speed and scale. Ethics is about deliberation, about taking the time to say, “Are these really the parameters we want, or is what we’re doing just going to lead to more civilian casualties?”

If you’re not given the space or the time to exercise these moral ideas that every military should have and does normally have, then you’re becoming an automaton. You’re basically saying, “I’m part of the machine. Moral calculations happen somewhere prior by some other people, but it’s no longer my responsibility.”

Sigal Samuel

This ties into another thing I’ve been wondering about, which is the question of intent. In international law contexts like the genocide trial against Israel, showing intent among human decision-makers is key. But how should we think about intent when decisions are outsourced to AI? If tech reshapes our cognition, does it become harder to say who is morally responsible for a wrongful act in war that was recommended by an AI system?

Elke Schwarz

There’s one objection that says, well, humans are always somewhere in the loop, because they’re at least making the decision to use these AI systems. But that’s not the be-all, end-all of moral responsibility. In something as morally weighty as warfare, there are multiple nodes of responsibility — there are lots of morally problematic points in the decision-making.

And when you have a system that distributes the intent, then with any subsystem, you have plausible deniability. You can say, well, our intent was this, then the AI system does that, and the outcome is what you see. So it’s hard to attribute intent and that makes it very, very challenging. The machine doesn’t give interviews.

Sigal Samuel

Since AI is a general-purpose technology that can be used for a multitude of purposes, some beneficial and some harmful, how can we try to foretell where AI is going to do more harm than good and try to prevent those uses?

Elke Schwarz

Every tool can be refashioned to become a weapon. If you’re vicious enough, even a pillow can be a weapon. You can kill somebody with a pillow. We’re not going to prohibit all pillows. But if the trajectory in society is such that it seems there’s a tendency to use pillows for nefarious purposes, and access to pillows is really easy, and in fact some people are designing pillows that are made for smothering people, then yes, you should ask some questions!

That requires paying attention to society, its trends and its tendencies. You can’t bury your head in the sand. And at this point, there are enough reports out there about the ways in which AI is used for problematic purposes.

People say all the time that AI will make warfare more ethical. It was the claim with drones, too — that we have surveillance, so we can be a lot more precise, and we don’t have to throw cluster bombs or have a large air campaign. And of course there’s something to that. But very often these weapons are not used in such a precise manner.

Making the application of violence a lot easier actually lowers the threshold to the use of violence. The incentives are to use the systems at large scale and in ways that expand violence rather than contract it.

Sigal Samuel

That was what I found most striking about the +972 investigations — that instead of contracting violence, Israel’s alleged AI systems expanded it. The Lavender system marked 37,000 Palestinians as targets for assassination. Once the army has the technological capacity to do that, the soldiers come under pressure to keep up with it. One senior source told +972: “We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us. We finished [killing] our targets very quickly.”

Elke Schwarz

It’s kind of a capitalist logic, isn’t it? It’s the logic of the conveyor belt. It says we need more — more data, more action. And if that is related to killing, it’s really problematic.
