The debate about AI editing for photographers feels exactly like one of those college assignments where both sides have legitimate points, feelings run high, and the “right” answer depends on who you ask.
Let me explain what I mean.
Way back when I was in college—what feels like a million years ago—I took a class that was required for anyone in the journalism department. We lovingly called it “Info Hell,” though its official name was Information Gathering. The premise of this course was that you had to find a topic that reasonably had two sides to the argument, where both could be right or wrong depending on your perspective. There wasn’t really one side that was more right than the other, but people felt strongly either way, and you could potentially, through reasoning and solid research, convince someone on the other side to come to yours.
It couldn’t be something like abortion or another topic so polarizing that you’d never be able to change someone’s opinion on it. At the time, I chose whether or not juveniles should be charged as adults in criminal cases. It was an incredibly interesting topic, and I feel like I learned a ton throughout the process—not only about where I stood on the issue, but also about the ability to change our minds as we discover new things.
If I Were Taking That Class Today…
I’d choose one of these hot-button topics that have the photography world in an uproar:
- Should we post children’s images online in this AI era?
- Will using AI editing tools eventually replace photographers entirely?
The question of whether AI editing for photographers is ethical, safe, or smart has divided the community. Some see it as the future of efficient workflow. Others see it as training our own replacement.
These questions have photographers—especially newborn photographers—genuinely worried. Should we show our subjects’ faces online? Can AI steal our children’s identities or put them at risk if anyone can see their faces? And what about our work being used to train algorithms that could eventually replace us?
My Initial Instinct
Here’s the thing: I’ve actually had a client who was absolutely outraged at me for posting her baby’s picture online. She was convinced it was her child and that I’d violated her trust.
Except it wasn’t her baby.
It was a different baby that looked very similar—same setup, same props from my closet, same blue wraps and bowls that I’d used for her session. When you’re a newborn photographer who uses a formula (and let’s be honest, we all have our formulas), babies can look remarkably similar to each other. If you pick blue and another family picks blue, and both babies have blue eyes, blonde hair, and are Caucasian, you might end up with images that are very, very similar.
I do the same poses. I use the same props. It’s a system that works.
So when that client saw what she thought was her baby online, I had to ask myself: Does that mean these babies are actually at risk of being identified? Are we overreacting to the AI threat? Or are there legitimate concerns we should be addressing?
I decided to dig deep. I asked Claude AI to research these topics and present an unbiased look at both sides of the argument. Here’s what we discovered, and why I believe the answer is more nuanced than the panic would suggest.

The History of Photo Manipulation: We’ve Actually Been Here Before
Before we dive into the AI concerns, let’s get some perspective. There’s this narrative going around that photo manipulation is new, that we’re entering some unprecedented era of fake images and stolen identities.
That narrative is completely wrong.
Photo retouching is as old as photography itself. The first known example of photo manipulation happened in 1846—barely two decades after the first photograph was ever created. Photographer Calvert Richard Jones literally painted out an extra friar from a photograph using India ink on the negative because he didn’t like how the composition looked with five people instead of four. (You can see the actual photograph at the Metropolitan Museum of Art.)
The Pre-Digital Era
From the 1840s through the 1980s, photographers manipulated images using:
- Darkroom techniques: Dodging (lightening areas) and burning (darkening areas), which Ansel Adams famously mastered
- Physical alteration: Scratching negatives, painting on negatives with India ink
- Airbrushing: Physically painting on prints to remove blemishes and create flawless skin
- Combination printing: Merging multiple negatives to create composite images
- Retouching machines: In the 1930s, George Hurrell’s retoucher James Sharp used a vibrating machine to smooth Joan Crawford’s skin and remove fine lines from her face (see the before/after comparison at PetaPixel)
And it wasn’t just for glamour and beauty. Photo manipulation was widely used for political purposes. Stalin routinely had people erased from photographs after they fell out of favor—the most famous example being Nikolai Yezhov, who was painted out of a photo after his execution, replaced by water from the Moscow Canal. Governments fabricated meetings that never took place. The 1942 photo of Benito Mussolini riding a horse had the stableman completely removed to make it look more heroic. (Or so Wikipedia claims.)
In 1982, National Geographic famously “squeezed” two pyramids closer together digitally to make them fit their cover format better—a decision they later admitted was a mistake, but it showed that even respected publications were manipulating reality for aesthetic purposes.
The Photoshop Revolution
Then came February 19, 1990, when Adobe Photoshop 1.0 hit the market for $895 (about $2,100 in today’s money). Before Photoshop, professional photo retouching on high-end Scitex systems cost around $300 per hour for basic work. Suddenly, that same level of manipulation was available to anyone with a Mac and less than a thousand dollars.
Photoshop didn’t invent photo manipulation—it made it accessible, fast, and virtually invisible.
Key innovations that changed everything:
- Layers (1994): Non-destructive editing that gave unprecedented control
- Healing Brush (2002): Revolutionary tool for removing blemishes seamlessly
- Content-Aware Fill (2010): Intelligent removal of objects from images
- AI Features (2020+): Neural Filters, Sky Replacement, and now Generative Fill
The Critical Takeaway
The ethics of photo manipulation didn’t change with Photoshop or AI. What changed was the accessibility, speed, and invisibility of the edits.
Photographers have ALWAYS manipulated images. We’ve always removed blemishes, smoothed skin, adjusted lighting, removed distracting elements, and created idealized versions of reality. Each technological leap—from darkroom techniques to Photoshop to AI—brought the same concerns we’re hearing now:
- “This will put retouchers out of business!”
- “No one will trust photographs anymore!”
- “This is cheating/unethical/the death of real photography!”
And yet, here we are. Photography adapted, evolved, and survived every previous technological revolution. The question is: Is AI different this time, or is it just the next chapter in a very old story?
The Evoto Controversy: What Actually Happened and Why Photographers Are Furious
If you’re in the newborn or portrait photography space, you’ve probably heard about what photographers are calling “Headshotgate.” This controversy from January 2026 perfectly illustrates the complex relationship between photographers and AI tools—and why trust matters so much.
What Happened
Evoto AI has positioned itself for years as a tool FOR professional photographers. Their software helps with portrait retouching—removing flyaway hair, fixing eyeglass glare, smoothing skin, all the tedious stuff that eats up hours in post-production. Photographers loved it. Many of us became paying customers, uploaded thousands of client images for editing, recommended the software to colleagues, and some even became official Evoto ambassadors.
Then came Imaging USA 2026 in Nashville, one of the biggest professional photography conferences in the country. While Evoto had a booth at the show, news broke that they had quietly launched a separate consumer-facing product: the “Online AI Headshot Generator.”
Here’s what made photographers furious: This wasn’t just another editing tool for professionals. It was a public website where ANYONE could upload a selfie and instantly get polished, professional-looking corporate headshots. No photographer needed. No lighting setup. No posing expertise. Just upload and download.
And the marketing was explicit: It was “faster, cheaper, cuts studio costs” compared to booking a professional photographer. The FAQ literally said it was an alternative to hiring a traditional headshot studio.
Photographers who’d been paying for Evoto, promoting it, and uploading their client work felt utterly betrayed. Evoto had bitten the hand that fed it, and the backlash was fierce—people reportedly flipped off the Evoto booth at Imaging USA and booed when representatives walked by.
Evoto’s Response
Evoto quickly pulled the headshot generator and issued an apology. They claimed:
- This was just a “technical pilot” that went live accidentally
- It “missed the mark” and “crossed a line”
- They DON’T use customer images for AI training—only commercially licensed imagery
- They understand they violated the trust photographers placed in them
But here’s why many remain skeptical: A fully functional website with examples, detailed FAQ, pricing structure, and marketing copy doesn’t look like an “accident.” It looks like a product launch that got discovered before the company was ready to deal with the backlash.
The Two Legitimate Sides
Side 1: Photographers’ Valid Concerns
The outrage isn’t just emotional—there are real issues here:
- Betrayal of trust: Photographers funded Evoto’s development through subscriptions, promoted it to the community, and uploaded client work. Discovering that Evoto built a tool designed to replace those same photographers feels like a knife in the back.
- Training data questions: The AI-generated headshots look suspiciously similar to professional headshot photography style—exactly the kind of work photographers upload to Evoto for editing. Despite Evoto’s denials, photographers wonder: Were our images used for training?
- Economic threat: It’s one thing to compete with other photographers. It’s another to have the software you’re paying for build your competition.
- The “accident” claim: Many don’t buy it. As photographer Sal Cincotta (a former Evoto ambassador) pointed out, you don’t accidentally create a fully branded, functional website with marketing materials.
Side 2: The Uncomfortable Reality
But let’s also acknowledge the other side:
- AI headshot generators already exist: Evoto didn’t create this category. Multiple companies offer similar services. Removing Evoto’s version doesn’t make the technology go away.
- Evoto claims legitimate data practices: They say they use only commercially licensed and purchased imagery for training, not customer uploads. Whether you believe them is a trust question, but there’s no proof they’re lying. (so far anyway)
- Technology marches forward: Software companies exist to develop and sell technology. Evoto isn’t “evil” for building AI tools—they’re doing what tech companies do.
- The threat is real regardless: Whether or not Evoto specifically trains on your images, AI systems can already learn and replicate photography styles from publicly available work.
What This Controversy Actually Revealed
This entire situation exposed something crucial that gets lost in the panic: Not all photography is equally vulnerable to AI disruption.
And that’s what we need to talk about next.
Which Photography Businesses Are Actually Threatened by AI?
This is where the conversation needs to get specific, because blanket statements like “AI will replace photographers” miss the nuance entirely.
HIGH RISK: Headshot & Corporate Photography
Value proposition: A professional-looking image for business use (LinkedIn, company website, press kit)
What clients actually need: A polished, credible appearance
Do clients care if it’s “real”? Not really. They need a professional image that makes them look good. The moment captured doesn’t have inherent meaning beyond the visual result.
Can AI deliver this? Yes. Upload a selfie, get a professional-looking headshot. For many clients, that’s good enough.
Replacement risk: VERY HIGH
This is why headshot photographers are justifiably concerned. Their entire value proposition can be replicated by AI for a fraction of the cost. The person who needs a quick LinkedIn headshot and doesn’t care about the “experience” of a professional shoot will absolutely use an AI generator if it saves them hundreds, if not thousands, of dollars.
LOW RISK: Experiential & Documentary Photography (That’s Us!)
Value proposition: Captured THIS moment with THIS baby/couple/family
What clients actually need: Memory documentation of a real, irreplaceable moment in time
Do clients care if it’s “real”? ABSOLUTELY. That’s the entire point. They want a photo of their actual baby at this actual age.
Can AI deliver this? No. AI can generate a beautiful image of “a baby,” but it cannot capture the actual moment that actually happened with their actual child.
Replacement risk: VERY LOW for premium clients
The Critical Distinction
Think about what your $2,000+ newborn client is actually buying:
They’re NOT buying “a pretty picture of a baby in a nice setup with good lighting.”
They’re buying:
- Documentation of their specific baby at this unrepeatable stage
- A captured memory of their newborn’s first week of life
- Something real to show their child as they grow up
- The experience of having a professional safely pose and photograph their precious infant
- Your expertise in soothing a fussy baby and getting the shots even when things are difficult
- Multiple angles and genuine moments from an actual session
- A keepsake that represents an actual moment in time
An AI-generated baby that looks “close enough” is not a substitute for any of those things.
The Market Segmentation Reality
Low-End Market (Price-Sensitive Clients)
- These clients want “good enough”
- They would use AI if it’s cheaper/easier
- Before AI, they used: iPhone + VSCO filters, friend with a “nice camera,” $50 Craigslist photographers
- This market IS being disrupted by AI
- But these were never your $2,000 clients anyway
High-End Market (Your Actual Clients)
- They want documented reality of their specific baby
- They value experience, expertise, safety, artistry
- They want something authentic to treasure as their child grows
- They understand the difference between “a nice photo of a baby” and “a photo of MY baby”
- This market is NOT being disrupted by AI
This is similar to how Walmart portrait studios got disrupted by phone cameras, but fine art family photographers remained strong because their clients were buying something fundamentally different.
The person willing to accept an AI-generated baby instead of hiring you was never going to pay $2,000 for your services anyway.
The AI Training Data Question: Should Photographers Actually Fear It?
Okay, this is where we need to get really honest about what’s happening with our images and whether using AI editing tools makes it worse.
The Concern
When you upload images to AI editing platforms like Evoto, the fear is:
- Your images become training data for AI models
- Your copyrighted work teaches the AI to replicate your style
- Eventually, AI can generate images “in the style of [Your Studio Name]”
- You’ve essentially trained your own replacement
It’s a reasonable concern. And if you’re reading the panic posts in Facebook groups, it sounds terrifying.
The Reality We Need to Face
Here’s what’s actually happening:
The major AI models have already scraped the entire internet.
DALL-E, Midjourney, Stable Diffusion, and others were trained on billions of images collected from:
- Instagram (all public posts)
- Photography websites
- Blogs
- Portfolio sites
- Anywhere images are publicly accessible
Your Instagram feed? Already in the training data. Your website portfolio? Already in the training data. That blog post from 2019 with your best work? Already in the training data.
Research has found that large datasets like LAION-5B (used to train popular AI tools) contain identifiable photos of real people, including children, scraped from personal blogs, photo-sharing sites, and social media. The internet has already been harvested.
What This Means For AI Learning Your Style
Right now, today, someone can go to any AI image generator and type:
“Newborn baby photography in the style of [Your Studio Name]”
Or:
“Baby photo with soft lighting, cream tones, wrapped in knit blanket, brown wooden bowl prop, overhead angle, professional newborn photography aesthetic”
And the AI will generate something that looks remarkably similar to your work.
This is already happening. Whether you use Evoto or not.
AI tools can replicate:
- Your lighting style
- Your color grading preferences
- Your composition choices
- Your editing aesthetic
- Your typical props and setups
- Your posing style
Anyone can screenshot your Instagram posts, save images from your website, or download your blog images and feed them into AI image generators for style reference. Some AI tools specifically let you upload a reference image and say “make it look like this.”
The Question That Actually Matters
So using Evoto to retouch your images doesn’t meaningfully change this equation. Your style is already learnable by AI from your publicly posted work.
The real question isn’t: “Will AI learn my style if I use these tools?”
The real question is: “Does using AI tools help my workflow while I serve clients who value what AI fundamentally cannot replace?”
What AI Can vs. Cannot Replicate
AI CAN Replicate:
- Your visual aesthetic
- Your editing style
- The “look” of your work
- Technical execution
- Common poses and setups
AI CANNOT Replicate:
- The actual moment with the actual baby
- The experience of your session
- Your ability to soothe a crying newborn
- Your expertise in safe posing
- The relationship you build with clients
- The story you’re documenting
- A real memory parents can treasure
- Something authentic to show the child when they grow up
If someone prompts an AI: “Create a newborn photo that looks like it was shot by Glean & Co Photography,” they might get something with similar aesthetic qualities.
But they won’t get a photo of their actual baby. They’ll get a generic AI-generated infant that doesn’t exist. (and possibly has an extra toe or two)
As I said earlier, the clients who are willing to accept that were never going to pay $2,000+ for your services anyway.
Children’s Images Online: Understanding the Real Risks
Now let’s tackle the other big question: Should we be posting children’s images online at all in this AI era?
This is where parents are genuinely scared, and honestly, some of the concerns are based on real threats. But—and this is important—the risk profile varies dramatically depending on the context and how images are shared.
The Parental Fear Chain
Parents worry about:
1. Deepfakes
- AI can create convincing digital clones with as few as 20 photos
- Fake videos can show a child saying or doing things they never did
- Technology is advancing rapidly and becoming easier to use
2. Sexual Exploitation
- Approximately 90% of deepfake content is explicit
- 99% of AI-generated abuse material targets women and girls
- “Nudifier” apps can digitally undress anyone in a photo
- These apps are freely accessible and generate an estimated $36 million annually
3. Identity Theft
- Barclays Bank estimates that by 2030, 7.4 million cases of identity fraud per year could be linked to parents oversharing online
- With enough photos and information, criminals can create fake IDs using a child’s face
- Biometric data from facial images could be exploited in the future
4. Future Loss of Control
- The average 5-year-old already has 1,500+ photos online (uploaded by parents without consent)
- Once online, images are permanent—living in backups, caches, screenshots forever
- Children grow up with digital footprints they never agreed to
- Future employers, partners, anyone can access their childhood photos
These Fears Are Based on Real Issues
This isn’t paranoia. Real things are happening:
- New Jersey high school case: 30+ teen girls had deepfake nude images created by classmates using regular school photos
- Take It Down Act: Signed into law in 2025 because non-consensual fake nudes became such a widespread problem
- Voice cloning scams: Criminals use AI-generated voice clones of children to call parents demanding ransom
- Child sexual abuse material (CSAM): Can now be generated at scale using real children’s faces from innocent photos online
The Deutsche Telekom campaign created a powerful (and unsettling) advertisement featuring “Ella,” a 9-year-old whose image was aged using deepfake technology to show her as an adult, warning about the consequences of “sharenting”—parents sharing children’s content online.
So yes, the threats are real.
BUT—The Risk Profile Matters Enormously
Here’s what gets lost in the panic: Not all online images carry the same risk.
HIGH RISK Scenario:
- Parent posts daily photos of “Emma” on public Instagram
- Captions include: school name, activities, birthday, “Emma’s first day at Bright Futures Preschool!”
- Photos are geo-tagged at home, the local park, grandma’s house
- 2,000 public followers including strangers, acquaintances, friends of friends
- Ongoing documentation of Emma’s routine, schedule, and life
Why this is risky: A predator could identify Emma specifically, know where she goes to school, understand her routine, locate her home, and build a complete profile. The ongoing nature of the sharing means they can track her in real-time.
LOW RISK Scenario:
- Professional photographer posts an anonymous newborn photo
- No name mentioned anywhere
- No personal details in caption (“Loving these soft blue tones”)
- No location metadata or geo-tagging
- Watermarked with business name
- Single image from one session, not ongoing documentation
- Used to showcase photography skill and artistic style
Why this is low risk: There is no way to identify or locate this specific child. The image exists as “a beautiful newborn in a blue setup,” not as “Emma Johnson, born January 15th, 2026, lives at 123 Main Street, daughter of Sarah and Mike Johnson, attends Little Learners Daycare.”
The Critical Distinction
The risk isn’t simply “photo of child online.”
The risk is: Identifying information + accessibility + ongoing documentation = targetable subject
An anonymous portfolio image has a fundamentally different risk profile than a parent’s oversharing on social media.
Think about it this way:
- A child in the background of a tourist’s vacation photo at Disneyland = extremely low risk
- A child with a name tag at school, posted daily by parents on public accounts = high risk
The anonymous newborn in your portfolio is much closer to the first scenario than the second.
What About AI Training on These Images?
Here’s an uncomfortable truth: If someone wanted to train AI on “baby faces” for nefarious purposes, they have access to:
- Billions of family photos already online
- Stock photography libraries
- Medical imaging databases
- Publicly shared photos from the last 20 years of social media
Your anonymous portfolio image of one baby isn’t meaningfully adding to that ocean of available data in a way that creates specific risk for that specific child.
The baby isn’t identifiable. The baby’s name, location, routine, and personal information aren’t connected to the image (unless the parents themselves post it). In terms of AI training, it’s just another face in a dataset of billions—with no identifying metadata attached.
What This Means for Professional Photographers
So where does all this research leave us as newborn photographers? What should we actually be doing?
When Clients Question Model Releases
This is where it gets practical. You’re going to have clients who are genuinely concerned about their baby’s image being online. They’ve read the scary articles. They’re worried about AI and deepfakes and identity theft.
Here’s what you can tell them about how you protect their child:
✅ No real names used – You never caption images with the baby’s actual name or family name
✅ No identifiable locations – You don’t mention the city, show recognizable landmarks, or include geo-tagged data (particularly important for newborn photographers doing in-home sessions)
✅ No distinctive identifying features – You avoid showing unique birthmarks, medical conditions, or anything that could specifically identify this child vs. any other baby
✅ Metadata stripped – You remove GPS coordinates, timestamps, device information, and any other data embedded in the file
✅ Single session images only – You’re not documenting their child’s ongoing life, school, routine, or any information that creates a timeline
✅ Professional context – Your portfolio presents this as artistic work showcasing your skill, not as personal documentation of a named child
✅ Anonymous presentation – Even if someone wanted to misuse the image, there’s no identifying information to connect it to this specific baby and family
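Of all the protections on that checklist, metadata stripping is the most mechanical—and the easiest to get wrong by forgetting it. As a rough illustration of what “stripping metadata” actually means at the file level, here is a minimal sketch using only the Python standard library that removes the APP1 segments of a JPEG, which is where EXIF data (including GPS coordinates and timestamps) lives. This is a simplified teaching example, not a production tool—in practice a dedicated utility such as exiftool, or your export software’s “remove metadata” option, is more robust.

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Return a copy of a JPEG with APP1 segments (EXIF/XMP,
    including GPS coordinates and timestamps) removed."""
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG file")
    out = bytearray(b"\xff\xd8")  # keep the Start-of-Image marker
    i = 2
    while i < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            raise ValueError("corrupt segment marker")
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # Start-of-Scan: image data follows,
            out += jpeg_bytes[i:]  # copy the rest verbatim
            break
        # Segment length is big-endian and includes its own 2 bytes
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        if marker != 0xE1:  # drop APP1 segments, keep everything else
            out += jpeg_bytes[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

The point of the sketch is that metadata isn’t baked into the pixels—it rides along in separate, removable segments of the file, which is why a stripped image looks identical but no longer reveals where or when it was taken.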
Making The Distinction Clear
You can explain it this way:
“What makes professional portfolio use lower risk than social media sharing is the anonymity and context. Your baby’s photo in my portfolio looks like ‘a beautiful newborn in a blue setup’—not ‘Emma Johnson, born January 15th, lives at 123 Main Street, attends Little Learners Daycare.’ There’s no way for someone to identify, locate, or target your specific child from an anonymous portfolio image.”
You might also share the research about risk profiles:
“The scenarios where children have been victimized by AI deepfakes or online predators involve ongoing documentation with identifying information—parents posting daily updates with names, locations, schools, and routines on public accounts. That’s completely different from a single anonymous portfolio image used in a professional context.”
Respecting Client Choice
That said, parental fear is valid even when the actual risk is low. Some parents have personal reasons for wanting extra privacy, and that’s okay.
Offer alternatives:
- Images from behind that don’t show the baby’s face
- Limited portfolio use (in-person samples only, not posted online)
- Time-limited posting (image will be removed from website/social after 1 year)
- Private gallery access only (password-protected, not indexed by search engines)
- Pricing adjustment if needed (discount for full model release vs. restricted use)
The key is to be understanding and flexible while also educating about what actually creates risk vs. what’s relatively safe.
The Story That Started This Section
Remember my client who was convinced I’d posted her baby? That situation perfectly illustrates why we need to have these conversations and set clear expectations.
When you use formulas—the same props, poses, setups, and editing style—babies photographed weeks or months apart can look incredibly similar. If both families chose blue, and both babies have similar coloring, the images might be nearly identical.
That doesn’t mean one baby can be identified as another. It means newborn photography often produces similar-looking results because newborn babies, frankly, look pretty similar to each other, especially when posed and styled the same way.
My client’s fear was understandable, but it was based on misunderstanding how newborn photography works and what actual risk looks like.

Taking Control: AI Editing Privacy Settings Photographers Should Know
If you decide to use Evoto (or are already using it), there are specific settings you can adjust to further protect your privacy and control how your data is used.
Important caveat: We have to trust that Evoto is honoring these settings and not using our images without permission. There’s no way to audit their internal processes. But at the very least, adjusting these settings means you’re not explicitly giving them permission to use your content—and if they violated that, it would be a clear breach of their own terms.
Here’s What You Can Control:
1. Export Settings – Image Caption/Description
When you export images from Evoto, you have a choice about what metadata gets embedded:
- “Show Evoto” – This adds Evoto information to your exported images
- “Show original info” – This preserves your original metadata
Recommendation: Select “Show original info” so your exported images retain your camera settings and copyright information rather than Evoto’s information embedded in the metadata.

2. Content Analysis – The Big One
This is the setting that matters most for the training data concern:
“Allow my content to be analyzed by Evoto for product improvement and development purposes”
The description states: “Evoto may analyze your content using techniques such as machine learning to improve our products and services. You may opt out of anonymous data collection at any time.”
Recommendation: Turn this OFF if you don’t want your images potentially used for improving their AI models.
Here’s the thing: Evoto says this analysis is “anonymous” and for “product improvement.” That could mean they’re analyzing general patterns (like “users tend to smooth skin at this intensity level”) without actually training on your specific images. Or it could mean something more. We don’t know for certain.
What we DO know is that having this setting ON gives them explicit permission to analyze your content with machine learning. Having it OFF removes that permission.
3. Software Usage Information
“Share my usage information”
This collects data about how you use the software—what features you click, how long tasks take, etc. This is standard software analytics that helps them improve the user interface and performance.
Recommendation: This one is up to your comfort level. This type of usage data (not your actual images, but your interaction patterns) is pretty standard across most software. If you want maximum privacy, turn it off. If you want to help improve the software’s usability, you can leave it on.
4. Cloud Storage
“Enable Cloud Storage”
This determines whether your edited images are stored on Evoto’s cloud servers. According to their description: “This setting enables cloud storage such as AI Color Match. You may disable it at any time. Once disabled, your content will no longer be saved on the cloud. Please manually stop any ongoing sync and cloud-related tasks before disabling. If not manually stopped, they will continue to finish. You can still access and use content already saved to the cloud after you delete. You can re-enable this setting at any time.”
Recommendation: If you prefer to keep everything local and not have your images on their servers at all, turn this OFF. You’ll lose cloud-based features like AI Color Match, but your images stay entirely on your computer.
How to Access These Settings
- Open Evoto
- Click on Settings (usually a gear icon)
- Navigate to Privacy in the left sidebar
- Review and adjust each setting based on your comfort level
The Trust Factor
Here’s what we need to be realistic about: Even with all these settings turned off, we’re trusting Evoto’s word that they’re:
- Not training on our images despite their claims
- Honoring our privacy settings
- Actually deleting data when we disable cloud storage
- Being transparent about their data practices
There’s no independent audit. No way to verify. That’s the uncomfortable reality of using any cloud-connected software.
This is why I keep coming back to the same point: The real risk isn’t specifically about Evoto—it’s about any of your publicly posted work being accessible to AI systems that have already scraped the internet. These settings give you more control over one specific company, but they don’t change the broader landscape.
Use these settings to protect your privacy where you can. But also recognize that the horse has largely left the barn when it comes to AI learning from publicly available photography.
The AI Editing Tools Question: Should Photographers Actually Use Them?
Alright, let’s bring this all together. After diving deep into the research (Info Hell taught me well), examining both sides of the argument, and weighing the actual threats against the overblown fears, here’s where I land on whether photographers should use AI editing tools like Evoto.
What I Now Believe
1. AI can already replicate your style from publicly posted work
This ship has sailed. Your Instagram posts, website portfolio, blog images, Pinterest pins—it’s all accessible. The major AI models have already scraped the internet. Someone can prompt an AI right now to generate images “in your style,” and using or not using Evoto doesn’t change that reality.
The genie is out of the bottle. The question isn’t whether AI can learn from your work (it can and has), but what you do with that knowledge.
2. Your business model inherently protects you
This is the most important realization from all this research: You’re not selling “pretty baby aesthetic.” You’re selling “documentation of THIS baby at THIS moment.”
AI cannot replace that core value proposition.
3. The low-end market was always going to find cheap alternatives
These were never your $2,000+ clients anyway. You’re not losing customers to AI—you’re watching a market segment that was never yours get disrupted by cheaper technology. That’s not the same thing.
4. Tools that improve workflow are worth considering—if they serve your real goals
This is where it gets personal and practical. If Evoto or similar AI tools:
- Save you hours of editing time per session
- Let you take on more sessions without burning out
- Give you more time with your family (remember them?)
- Improve your final product quality
- Help you deliver work faster to delighted clients
Then the theoretical risk—which exists whether you use the tool or not—may be worth the very real, tangible benefits.
I’m not saying every photographer should use AI editing tools. I’m saying the decision should be based on practical workflow benefits vs. your specific needs, not on fear of a threat that exists regardless of your choice.
5. Focus your energy on what AI cannot replace
This is the strategic position that makes sense:
- The experience of working with you
- Your expertise in safely posing newborns
- Your ability to soothe and photograph a fussy baby
- The captured authentic moments of a real session
- The story you’re documenting
- The relationship you build with families
- The real memory you’re creating
These are your competitive advantages. The client experience is what clients are actually paying for. That cannot be replicated by AI, now or in the foreseeable future.
My Personal Decision
I’m continuing to use AI-assisted editing tools like Evoto in my workflow because:
- They improve my efficiency and final product – I can deliver better work faster, which means happier clients and more time for my family.
- The training data concern exists either way – My publicly posted work is already accessible to AI systems. Using Evoto doesn’t meaningfully change my exposure.
- My editing style is part of my value, but not the whole story – I’m capturing irreplaceable moments with a signature aesthetic. AI might copy my look, but it can’t photograph THIS baby for families who want documentation of their actual child, not just a beautiful image of any baby.
- I serve clients who get it – My $2,000+ clients value documentation of their actual baby. They’re not my target market for AI-generated alternatives, and frankly, the person who would choose AI-generated baby photos was never going to choose my services regardless.
- Technology serves the photographer, not the other way around – Just like Photoshop before it, AI is a tool. I’m using it to serve my clients better, work more efficiently, and create a sustainable business that doesn’t require me to work 80-hour weeks.
Want to get a taste of Evoto AI? Check out The Newborn Photographer’s Guide to Evoto AI Mini Course and decide for yourself if it’s right for your workflow.
What About You?
Your answer might be different, and that’s completely okay.
Maybe you love the meditative process of manual retouching and don’t want to change your workflow—that’s valid. The techniques I’ve been refining since 2007 still work beautifully.
Maybe you’re genuinely uncomfortable with how these AI companies operate and want to vote with your wallet—that’s a principled position I respect.
Maybe you’re in a different photography niche (like headshots) where the threat profile is genuinely different—then your calculation changes.
The point isn’t that everyone should reach the same conclusion. The point is that we should make informed decisions based on understanding both sides of the issue, not operate from fear or hype.
Want to Learn More About Editing—With or Without AI?
I teach both approaches in the Newborn Editing Academy.
If you’re firmly in the “manual editing only” camp, no problem—I’ve been refining these tried-and-true techniques since 2007, and they absolutely still work. You don’t need AI to create beautiful, professional newborn images. The foundational skills of color correction, skin retouching, and artistic enhancement are timeless.
But if you’re curious about AI tools and want to explore Evoto or other options with someone who’s done the research, understands the concerns, and can show you how to use them effectively, we cover that too.
The beauty of education is that it empowers you to make your own informed choice rather than operating from fear or hype. You get to decide what tools serve your business, your workflow, and your values.
The Lesson from “Info Hell”
The real lesson from that information gathering class all those years ago wasn’t about juveniles or criminal justice systems. It was about being willing to examine both sides of an issue honestly and change your mind when presented with new information.
It was about intellectual humility—the ability to say, “I thought X, but after looking at the evidence, I now think Y.”
The world is changing. The tools are changing. Entire industries are being disrupted.
But the fundamental human need to document real moments, to capture authentic memories, to preserve irreplaceable times with the people we love—that’s not changing.
The value of capturing real, unrepeatable moments remains unchanged.
That’s where I’m placing my bet. That’s the business I’m building. That’s what AI cannot take from us.
I’ll use AI tools where they make me more efficient and let me serve my clients better. I’ll protect children’s privacy by using anonymous portfolio images without identifying information. I’ll educate my clients about real risks vs. overblown fears. And I’ll keep showing up to capture those precious first weeks of life—the actual babies, the real moments, the memories that matter.
Because at the end of the day, when a family looks back at their newborn photos ten years from now, they’re not going to care what editing software I used.
They’re going to care that it’s really their baby. Really that moment. Really that time in their lives.
And no AI can ever replace that.
Resources & Further Learning
Want to dig deeper into any of these topics? Here are some helpful resources:
- Evoto’s Official Privacy Policy – Read their complete terms on data usage and privacy practices
- Evoto’s Response to the Controversy – Their official statement about the headshot generator incident (Facebook page)
- Understanding Deepfakes and Child Safety – Comprehensive guide from Protect Young Eyes on AI risks for children
- Verizon’s Guide: AI Deepfakes and Your Kid’s Digital Footprint – Practical tips for protecting children’s online presence
- How to Protect Your Images from AI Training – Technical steps from Pixsy on metadata and image protection
- Photography Model Release Best Practices – ASMP’s guide to legal releases and client agreements
- Newborn Editing Academy – Whether you’re team manual editing (I’ve been perfecting these techniques since 2007) or ready to explore AI-assisted tools like Evoto, we cover it all. Not sure where you stand? We teach both approaches so you can make an informed decision for your own workflow.
Let’s Keep The Conversation Going
What’s your take on all of this? Are you using AI editing tools in your workflow? How do you handle client concerns about posting images online? Have you changed your mind on any of these issues after doing your own research?
I’d genuinely love to hear your perspective. This is one of those topics where reasonable people can disagree, and that’s okay—as long as we’re making informed choices rather than operating from fear or hype.
Drop your thoughts in the comments. Let’s learn from each other.
Paige | Glean & Co Photography
Newborn Photography • Boise, Idaho
