[HN Gopher] AI real-time human full-body photo generator
       ___________________________________________________________________
        
       AI real-time human full-body photo generator
        
       Author : bookofjoe
       Score  : 318 points
       Date   : 2023-08-23 15:28 UTC (7 hours ago)
        
 (HTM) web link (generated.photos)
 (TXT) w3m dump (generated.photos)
        
       | Madmallard wrote:
       | Accuracy on using an existing face seems pretty off. Certain
       | positions have terrible accuracy on body parts. Why would you
       | care to use this again? This just seems like another AI SaaS scam
       | project where they're basically charging others to use their
        | A100s with their copy-pasted implementation of the research
       | paper algorithms. These should just all be outlawed IMO.
        
       | oktwtf wrote:
        | I always thought the peak of popular culture had a bit of a
        | negative effect on everyone else's body image. Now we're going
        | to let AI define the ideals... this might be slippery.
        
         | lelanthran wrote:
         | "Now"?
         | 
         | I doubt everyone is going to want to have six mangled fingers
         | on one hand and a claw on the other.
        
           | oktwtf wrote:
           | Okay, agreed. I ran about 15-20 gens through the site, got
           | mangled feet/legs, disconnected appendages, extra appendages,
           | nudity, nudity imprinted onto clothing, just bizarre stuff.
           | 
           | Tomorrow's "Now", maybe.
        
       | Peritract wrote:
       | We know about all of the potential harms that deepfakes can
       | cause, the problems with inherent bias in training data, etc.
       | 
       | Creating/publicising a tool that winks at these issues (consider
       | the difference between the poses offered for 'male' and 'female'
       | bodies) but does nothing to mitigate them - and a lot to enable
       | them - is irresponsible at best.
        
         | RetroTechie wrote:
         | Perhaps more to the point:
         | 
         | All these AI content generators are still early stage. So it's
         | _kind of_ wild west for the time being.
         | 
          | The first cars were what? A horse carriage with an engine
          | duct-taped onto it? Only when they became more numerous did
          | things like traffic rules, reliable brakes & steering etc.
          | become important.
         | 
         | We're in the engine-with-wheels stage. Have fun, be happy.
        
         | UweSchmidt wrote:
         | It would be quite hard to make any AI tool that preemptively
         | avoids the wide range of potential issues that you've
         | mentioned. If tool makers are forced to always err on the side
         | of caution, it's likely that the resulting tool ends up
         | disappointing.
         | 
          | Only when published, and when put into context of the entire
          | work, could a creation be deemed harmful. A tool should not,
          | for example, prevent you from making a bunch of images with
          | ominous poses, from which you select one to use with an
          | article that discusses the history of ominous poses.
        
           | EGreg wrote:
           | In that case enjoy our proof of concept:
           | 
           | https://app.engageusers.ai
           | 
           | Everything from realistic faces to realistic posts. We tried
           | to make it as ethical as possible in multiple ways. But
            | ultimately it is designed to spur conversation on topics that
            | need a kickstart in engagement...
        
           | Peritract wrote:
            | Just because it's hard to make a tool that _can't_ be used
           | in negative ways doesn't mean that it's a good idea to make a
           | tool that (charitably) makes specifically negative uses easy
           | and (uncharitably) is deliberately designed for them.
           | 
            | Tool makers do err on the side of caution all the time - we
           | **** out passwords so users don't share them as easily, we
           | put safety catches in secateurs. "Build in safeguards against
           | the obvious issues" is a basic design step.
        
             | UweSchmidt wrote:
              | - your critique is vague, but at the same time touches a
              | sensitive area, implying a wrongdoing by the tool authors
             | that can't be refuted or fixed easily. What specifically
             | bothers you? Consider that active Twitter discussions
             | uncover and point out troublesome issues almost faster than
             | the general public can understand and digest.
             | 
             | - assuming you found an egregious issue, do you also double
             | down on maintaining that the tool is 'deliberately designed
             | to make negative uses easy'? How so?
             | 
             | - I disagree with the 'safety catches' metaphor and would
             | offer the 'hammer' metaphor instead.
             | 
             | - Actually, with the rapid development in this field I
             | expect that anyone will be able to locally prompt for any
             | content, even movies, soon, limited only by people's taste
             | and imagination; with this realization I don't think I will
             | follow up on this discussion that will surely be outdated
             | in a minute.
        
               | upwardbound wrote:
               | Peritract already called out a specific issue. The male
               | and female options come with different sets of selectable
               | poses, and some of the female poses are pornographic in
               | nature. This promotes the objectification of women.
        
           | notpachet wrote:
           | > If tool makers are forced to always err on the side of
           | caution, it's likely that the resulting tool ends up
           | disappointing
           | 
           | I don't disagree with you entirely, but I still have the
           | feeling like this will make a pretty good epitaph for
           | humanity some day.
        
       | trojan13 wrote:
        | Nice seeing a Nuxt web app in production.
        
       | smeej wrote:
       | Why is it impossible to generate a male model wearing anything
       | other than rolled denim jean shorts? I've tried things like "long
       | pants" or "ankle-length pants," but I cannot get it to stop
       | putting them all in denim shorts!
        
         | rvbissell wrote:
         | I accidentally ended up with a male in a miniskirt, simply
         | because the clothing defaults didn't change.
        
         | gg80 wrote:
         | Try with cargo pants. After three or four iterations the pants
         | disappeared and were substituted by something that can only be
         | described as an andrologist fever dream.
        
         | system2 wrote:
          | There is a clothing tab where you can select whatever you want.
          | I think the description doesn't override it.
        
         | dragonwriter wrote:
         | > Why is it impossible to generate a male model wearing
         | anything other than rolled denim jean shorts?
         | 
          | It's not; I got rolled-but-long white denim pants, white shirt,
          | white tie, and white jacket, with white deck-ish shoes, by
          | selecting "Formal" on the clothing tab and entering "Clothing:
         | white-tie formalwear".
         | 
         | But, yeah, there is a definite denim bias.
        
         | n8cpdx wrote:
         | I got one with regular shorts: https://generated.photos/human-
         | generator/64e67772190809000bb...
         | 
         | I don't understand why none of these generators want to make a
         | man with body hair. I specified Armenian and Armenians are
         | notoriously hairy.
         | 
         | Edit: this is if I specify "very hairy". Note that for a man,
         | there is a bit of hair in the usual places but still far short
         | of "very hairy" IMO. https://generated.photos/human-
         | generator/64e67aa38448b800095... (And the hair rendering is
         | bad, AI still doesn't understand directionality; there's a very
         | common pattern it should know, especially on the chest)
        
         | dcdc123 wrote:
         | Scroll down to the text field at the bottom and delete
         | everything in it. Then go back and reselect clothing.
        
         | trebligdivad wrote:
         | Yeh; tbh I thought it did a reasonable job with them though
         | https://generated.photos/human-generator/64e666d59563e6000e0...
         | (The hairline is way off, and he's way too muscular) I at least
         | don't spot anything too anatomically wrong.
        
           | kzrdude wrote:
           | Knees are weird, one of them tilted backwards, ears
           | asymmetrical.
        
         | dinkleberg wrote:
         | You might not like it but jorts are the pinnacle of lower body
         | coverings.
         | 
         | That is actually a hilarious issue.
        
         | evan_ wrote:
         | The first female model I generated was supposed to be wearing
         | jeans but was wearing what can only be described as a denim
         | belt with pockets
        
       | kortex wrote:
       | Oh yeah, totally ready for prime time, hyper realistic, SFW
       | filter works great, not at all hallucinations /massive_sarcasm
       | 
       | Actually NSFW, not safe for sanity. That's...not how body parts
       | work:
       | 
       | https://generated.photos/human-generator/64e644f39c8c0400108...
       | 
       | Prompt was "young woman with tattoos in miniskirt" really nothing
       | crazy there. But perhaps the latent space with that particular
       | pose is particularly raunchy.
        
         | philote wrote:
         | Yeah, this was my first attempt:
         | https://generated.photos/human-generator/64e648819c8c0400088...
         | 
         | Not quite right. I am, however, impressed that the fingers are
         | generally "mostly" correct.
        
           | 14 wrote:
           | my first attempt created some kind of gym monster lol
           | https://generated.photos/human-
           | generator/64e64a2f412bec0009b...
        
             | [deleted]
        
             | mrguyorama wrote:
             | This is actually an incredible example of "The longer you
             | look the worse it gets".
             | 
             | She has a second set of boobs where her hips should be!
             | That's not evolutionarily advantageous!
        
             | westernpopular wrote:
             | Your description had me curious, the picture had me
             | laughing out loud
        
             | kortex wrote:
             | _What a time to be alive!_
        
             | binkHN wrote:
             | Definitely the stuff nightmares are made out of!
        
           | kortex wrote:
           | Oof, I thought we banned thalidomide.
           | 
           | I wonder if their pose detection/interpolation struggles for
           | rarer poses, eg "kneeling with legs splayed leaning forward"
           | is quite specific in saucy contexts and fairly sparse in more
           | typical model shoots, so the manifold gets a bit holey, and
           | overlaps with similar poses like one knee up, one hand
           | forward.
        
             | philote wrote:
             | Yeah I think that's it.
             | 
             | Also, it's way too easy to make something that looks like
             | (or basically is) child porn with this tool. I chose Adult
             | but something else keeps triggering it to generate a child-
             | like face like in the image I posted. And as you showed,
             | it's easy to get accidental NSFW pics.
        
         | BiteCode_dev wrote:
         | It's very hard to filter NSFW content. Every site I tried,
         | unstable diffusion, kawaix.com, Mage space, novel ai... They
         | all have some content moderation on (to avoid CP, to keep
         | payment processors happy...), but things leak.
         | 
         | Some are really bad at filtering. Kawaix is particularly
          | terrible at it, because they are new, while Mage has upped
          | their game a lot but had many months to do so.
          | 
          | It feels like the 2000s again, and it's the wild west.
         | 
         | Plus when you have a horde of teenagers having a whole summer
         | to try prompts from their bed, you get serious pen testing
         | sessions.
        
         | 14 wrote:
         | I've spent the last hour trying to coax it to spit out nsfw
         | images and it definitely is not safe for work lol. I wouldn't
         | even want to post some of the seeds I generated. Nudity is not
         | prevented in this generator.
        
         | [deleted]
        
         | freeflight wrote:
         | There is a SFW filter?
         | 
         | I just let it generate a random woman with no prompt, and it
         | gave me a pretty good result, except there is a mask on the
         | face and literally bloody nude boobs;
         | https://generated.photos/human-generator/64d67874568faa0007a...
         | 
         | edit; I just realized it put in a default prompt
        
           | garyfirestorm wrote:
           | and I got a 'We detected that generated image contains nude
           | content. Try changing parameters.' despite not specifying any
           | such thing
        
         | barbariangrunge wrote:
         | Looks like something from a hellraiser movie
        
           | [deleted]
        
       | noman-land wrote:
       | These all look like cartoons to me.
        
       | ZYXER wrote:
       | ...prone to pron?
        
       | bdowling wrote:
       | > "If you want to use images produced by Human Generator in
       | commercial projects, contact us."
       | 
       | If there is no copyright in AI-generated images, then how can
       | they possibly enforce this?
        
         | jejeyyy77 wrote:
         | they can't.
        
         | [deleted]
        
         | caturopath wrote:
         | > If there is no copyright in AI-generated images, then how can
         | they possibly enforce this?
         | 
          | We don't have precedent here. Whether a work made by a person
          | using a website with a generative AI tool counts as having a
          | non-human creator isn't clear, and it seems to me that it does
          | have a human creator. Using a horse-hair brush to paint a
         | painting doesn't mean that the painting was created by a horse
         | and isn't subject to copyright. We'll have to find out
         | eventually whether over a dozen settings, some with a gazillion
         | options, and multiple freeform inputs counts as 'not created by
         | a human'.
        
           | wredue wrote:
           | Was there not just precedent set for this?
           | 
           | The horse hair example is nonsense. One might argue that
           | artists take inspiration from other artists to make the
           | argument that what the AI is doing is fine. But the ai is
           | actually only capable of blending what it's been trained on,
            | whereas an artist is not similarly limited. And this is why
            | the horse hair example is stupid.
        
             | caturopath wrote:
             | > Was there not just precedent set for this?
             | 
             | No.
             | 
             | > One might argue that artists take inspiration from other
             | artists to make the argument that what the AI is doing is
             | fine.
             | 
             | This doesn't seem to address what I took to be the relevant
             | part of IP law - that non-human authors don't create
             | copyrighted works. It was a reductio ad absurdum for
             | minimal non-human involvement. It's probably not the case
             | that a monkey stealing your camera and taking a selfie
             | creates a copyrighted work. It's probably the case that a
             | frog triggering a motion sensor you set up for nature
              | photography does. It's certain that painting normally with
              | a horsehair brush does.
             | 
             | Your remarks seem to make some sort of moral appeal, but
              | I'm not sure how they tie into the legal concerns I thought
              | were being raised.
             | 
             | > the ai is actually only capable of blending what it's
             | been trained on, whereas an artist is not similarly
             | limited.
             | 
             | I'm not sure what "blending" means here or what the actual
             | theories of generative art ML systems and of humans here
             | are. To call what the former do "blending" requires such a
             | broad definition I can't tell you if humans are blending as
             | well (at least some of the time, at least materially) when
             | creating works.
        
             | dragonwriter wrote:
             | > Was there not just precedent set for this?
             | 
             | No, there was a recent case where someone tried to claim an
             | AI as an author for copyright and themselves as owner via
             | work for hire, where it was ruled invalid because AI can't
             | be an author under copyright law; the ruling was explicit
             | that it was not addressing copyrightability by humans of
             | images they create using an AI generator as a tool, only
             | the claim of copyright with AI as the author.
        
         | Philpax wrote:
         | There haven't been any solid rulings on the copyright validity
         | of human-driven AI generation yet. There have been a few cases,
         | but they've been muddied by complicating factors (not a human
         | doing the generation - that is, autonomous generation - or the
         | generation being used as a base work for something else).
         | 
         | Additionally, even if there's no copyright, the terms of
         | service may still apply separately (see OpenAI disallowing
         | training a competitor model on output from OpenAI models)
        
           | basch wrote:
            | A copyright is a government-granted monopoly. The copyright
            | office has stated they won't grant monopoly privilege for
            | AI-generated art. The courts thus far have backed them up.
           | 
            | I would say it doesn't look good at the moment to try and
            | enforce ownership of something AI-generated; it would be an
            | uphill battle, and the default/null position would be that
            | the art is free to use, and unprotected by government.
        
             | dragonwriter wrote:
              | > The copyright office has stated they won't grant monopoly
              | privilege for AI-generated art.
             | 
             | No, they haven't.
             | 
             | They've said that _if_ the only human input is a text
             | prompt, _then_ it lacks the required human creativity to be
             | eligible for copyright protection.
        
               | tomcam wrote:
               | Not trying to be combative, but I don't see the
               | difference?
        
               | dragonwriter wrote:
               | Real AI imagegen workflows very often have more input
               | from the human creating the image than a text prompt.
        
               | basch wrote:
               | seems a little circular semantically. if it has
                | significant human input it's human-generated more so than
                | AI-generated, in which case we are saying the same thing.
        
           | cush wrote:
           | I don't think we're going to see a ruling against copyright
           | in the long term. When the rulings do come, they're going to
           | be complex (not that copyright law isn't already complex). As
           | prompting and working with AIs slowly becomes its own art and
           | skill, it will become clear that works need protection. We've
           | had "intelligent" filters and tools in Photoshop for decades,
           | this is just the next step in that evolution.
           | 
           | The only real problem here is that the original creators of
           | the art that these AIs were trained on didn't consent to this
           | type of use and aren't getting any kind of attribution or
           | payment. If they were recognized and compensated, there'd be
           | really nothing to talk about here - any work could be
           | copyrighted, with whatever derivative status the AI bakes in.
        
           | EMIRELADERO wrote:
           | > Additionally, even if there's no copyright, the terms of
           | service may still apply separately (see OpenAI disallowing
           | training a competitor model on output from OpenAI models)
           | 
           | Aren't contract clauses that relate to the distribution of
           | material preempted by the copyright act?
        
             | tzs wrote:
             | No. However contract clauses only apply to people who are
             | actually parties to the contract.
             | 
             | For example you and I could enter into a contract for me to
             | use AI to generate something that is not copyrightable from
             | data you provide and give you a copy of that thing. There
             | would in general be no legal problem if the contract
             | included restrictions on what you could do with that thing,
             | including restrictions on distributing it.
             | 
             | Part of the quid pro quo of a contract can be one party
             | giving up a right to do something that they would normally
             | have a right to do.
             | 
             | Now suppose the contract did allow you to make and
             | distribute copies as part of your product. Someone else
             | starts making copies of those copies you distributed and
             | distributing those copies.
             | 
             | There is no contract between me and that person, so I would
             | not be able to stop them. I've got no contract with them,
             | and the thing is not copyrighted, so there's nothing that
             | prevents them from copying it.
        
             | dragonwriter wrote:
             | > Aren't contract clauses that relate to the distribution
             | of material preempted by the copyright act?
             | 
             | Generally, no. It's possible for there to be interactions
             | in some cases, but the Copyright Act wouldn't generally
              | preempt contract terms. (It's closer to the other way
             | around, in that--to the extent copyright rights exist that
             | could otherwise be enforced--a relevant contract will
              | generally limit enforcement and recovery to breach of
             | contract rather than bare copyright action.)
        
           | bookofjoe wrote:
           | https://news.ycombinator.com/item?id=37195509
           | 
           | https://www.theverge.com/2023/8/19/23838458/ai-generated-
           | art...
        
             | space_fountain wrote:
              | I believe that, as the article says, despite the headline
              | this case was specifically about a situation where a
              | computer scientist wanted to list the AI as the one
              | creating the work. The case doesn't examine an argument
              | that things can be copyrighted when a human is involved,
              | either by filtering the output or even just by developing
              | the algorithm involved, and thus the human is the artist
              | and the AI is just a tool. I think what's clear is that
              | legally AI can't itself create a copyrighted work, just
              | like a camera can't be listed as the author of a work, but
              | it's not clear if a human using AI as a tool, either
              | through prompting or filtering, counts as a creative act
              | under copyright, or if AI-generated creations count as
              | derivative works of the model's weights.
        
         | wrs wrote:
         | They could do it with a click-through license agreement, but
         | they don't have one of those either. So it seems to have the
         | legal force of a polite request. (IANAL)
        
         | HPsquared wrote:
         | Does this apply to any photo taken by a camera with "AI"
         | filters? There must be a line somewhere.
        
           | HWR_14 wrote:
           | The current line is not based on a maximum AI input but a
           | minimum human input. You aimed the camera, your copyright.
           | You have an AI just create fake pictures and post them
           | without someone in the loop, no copyright. The questions are
           | mostly about how little you have to do.
        
             | pbjtime wrote:
             | While that may be the status today, I feel this is in no
             | way settled.
        
         | gumballindie wrote:
         | Question is why would anyone want to use this since it's so
         | buggy.
        
         | fnordpiglet wrote:
         | There are lots of tools that don't have copyrightable output
         | that require commercial licensing to use.
        
         | phyzome wrote:
         | As of now, copyright of AI-generated images _is not a settled
         | matter_. But I think smart money is on the courts coming down
         | on the side of copyright being applicable.
         | 
         | (If you're thinking of the recent court case, no, that was
         | unrelated; some guy was trying to pull a stunt and the court
         | did not actually rule on the thing you think they did.)
        
           | servercobra wrote:
           | Do you have a link or something re: your second point for
           | those of us who might not know?
        
         | sudobash1 wrote:
         | There are still terms of use, which can dictate how you are
         | allowed to use a website. And there are watermarks in the
         | corners.
        
           | notpachet wrote:
           | We can probably AI those out.
        
         | TehCorwiz wrote:
          | Autodesk Fusion 360 education edition limits what I can create
          | with it to non-commercial uses only. Despite me owning the
         | copyright for what I produce using it. I don't think you have
         | it right.
        
         | dragonwriter wrote:
          | It's the terms of use of the site: you are offered use of their
          | generator in exchange for agreeing not to use it commercially.
         | If you use it commercially, you are breaking those terms, which
         | they will argue are an enforceable contract. Copyright has
         | nothing to do with it.
        
           | mitthrowaway2 wrote:
           | But what if you use the generator, post the image in an
           | allowed non-commercial context, and then I copy that image
           | and use it commercially? I have no contract with the AI
           | generator company, and you didn't violate yours; it would
           | seem to me that the violation involved is a copyright
           | violation.
        
             | behringer wrote:
             | 100 percent legal to use the generated images as freely as
             | you like in that case.
        
         | disembiggen wrote:
         | if there were, globally, no copyright at all in any "ai"
         | generated images, and one confidently thought there never would
         | be, then simply using the images, in the case that one only
         | needed one or two images, would probably be fine.
         | 
         | however if there were large nations in which the law was still
         | in flux or unclear, or one wanted to generate new images on the
         | fly without fear of rate-limiting or refusal of service, then
         | one would potentially wish to work out an arrangement.
        
       | aerodog wrote:
       | Has the same problems as midjourney et al: you can feed it
       | pictures of a friend or yourself or a celebrity, and the result
       | is always off - not recognizable as them
        
         | Jeff_Brown wrote:
         | Some will surely consider that a feature rather than a bug.
        
           | mrguyorama wrote:
           | But then how am I supposed to be the kind of creepy person
           | that jerks it to poorly modified "nudes" made from Facebook
           | profile pics?
        
             | Jeff_Brown wrote:
             | "Back in my day stalkers had to use their _imagination_ ,
             | by gum."
        
         | Philpax wrote:
         | That's pretty easily fixable if you train a LoRA or similar so
         | that the model has a specific likeness in mind. (You can look
         | at - and despair at - Civitai if you want proof.)
         | 
         | It's harder to do at inference time without training, but I
         | wouldn't assume it'll be impossible forever, especially with
         | the existence of ControlNet.
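          | 
          | For reference, the core trick behind a LoRA is small: freeze
          | the base weights and learn a low-rank delta on top of them. A
          | minimal sketch in generic PyTorch (names are illustrative, not
          | any particular trainer's API):
          | 
          |   import torch.nn as nn
          | 
          |   class LoRALinear(nn.Module):
          |       # frozen linear layer plus a trainable low-rank update
          |       def __init__(self, base, rank=4, alpha=1.0):
          |           super().__init__()
          |           self.base = base
          |           for p in base.parameters():
          |               p.requires_grad = False  # freeze the original
          |           self.down = nn.Linear(base.in_features, rank,
          |                                 bias=False)
          |           self.up = nn.Linear(rank, base.out_features,
          |                               bias=False)
          |           nn.init.zeros_(self.up.weight)  # starts as a no-op
          |           self.scale = alpha / rank
          | 
          |       def forward(self, x):
          |           delta = self.up(self.down(x)) * self.scale
          |           return self.base(x) + delta
          | 
          | You typically swap these in for the attention projections of
          | the base model and train only the small matrices on photos of
          | the subject, so the likeness lives in a few megabytes of extra
          | weights.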
        
           | gwern wrote:
           | This is a GAN, so you can just project the image of yourself
           | into the latent space (which will give you a near-pixel-
           | perfect reconstruction), fix the identity-relevant variables
           | in the _z_, and edit it as necessary. (No workarounds like
           | finetuning necessary. Just one of the many forgotten
           | advantages of GANs.)
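            | 
            | A rough sketch of that projection step, assuming a generator
            | G(z) -> image with a z_dim attribute (illustrative only;
            | real inversions add a perceptual loss and often optimize in
            | W+ instead of z):
            | 
            |   import torch
            |   import torch.nn.functional as F
            | 
            |   def project(G, target, steps=500, lr=0.05):
            |       # optimize a latent until G(z) reproduces `target`
            |       z = torch.randn(1, G.z_dim, requires_grad=True)
            |       opt = torch.optim.Adam([z], lr=lr)
            |       for _ in range(steps):
            |           opt.zero_grad()
            |           loss = F.mse_loss(G(z), target)  # LPIPS helps
            |           loss.backward()
            |           opt.step()
            |       return z.detach()
            | 
            |   # then hold the identity-relevant coordinates of z fixed
            |   # and move the rest to edit pose, clothing, and so on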
        
             | GaggiX wrote:
              | You can project an image into the latent space with a
              | diffusion model too, via DDIM inversion.
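              | 
              | Roughly: run the deterministic DDIM update in reverse,
              | reusing the model's noise prediction at each step. A
              | sketch, where eps_model and abar (the cumulative alphas)
              | are stand-ins:
              | 
              |   import torch
              | 
              |   @torch.no_grad()
              |   def ddim_invert(eps_model, x0, abar):
              |       x = x0
              |       for t in range(len(abar) - 1):
              |           eps = eps_model(x, t)
              |           x0_hat = x - (1 - abar[t]).sqrt() * eps
              |           x0_hat = x0_hat / abar[t].sqrt()
              |           x = (abar[t + 1].sqrt() * x0_hat
              |                + (1 - abar[t + 1]).sqrt() * eps)
              |       return x  # approximate noise latent for x0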
        
             | DonsDiscountGas wrote:
             | >fix the identity-relevant variables in the _z_
             | 
              | Is that how the latent space works though? Like if it's a
              | 300-dim vector, is the face at locations 0-10?
        
       | samstave wrote:
       | "Human generator is at full capacity, please try again later"
        
         | AuryGlenz wrote:
         | No wonder the birth rate has been dropping.
        
         | bun_at_work wrote:
         | I wonder if it's actually overloaded or if there's a bug.
        
           | bookofjoe wrote:
           | HN Effect
        
             | titaniumtown wrote:
             | hug of death
        
         | [deleted]
        
       | gwern wrote:
       | If you're wondering how it's so fast and cheap and they can
       | generate variants so easily, it's because they're using GANs (see
       | the footer). GANs are way faster than diffusion models because
       | they generate the image in a single forward pass and their true
       | latent space encoding makes editing a breeze.
       | 
       | (And if you're wondering how it can look so good when 'everyone
       | knows GANs don't work because they're too unstable', a widespread
       | myth, repeated by many DL researchers who ought to know better,
       | GANs _can_ scale to high-quality realistic images on billion-
       | image scale datasets, and become more, not less, stable with
       | scale, like many things in deep reinforcement learning. See for
       | example BigGAN on JFT-300m
       | https://arxiv.org/pdf/1809.11096.pdf#page=8&org=deepmind ,
       | GigaGAN https://arxiv.org/abs/2303.05511 , Projected GAN
       | https://arxiv.org/abs/2111.01007 , StyleGAN-XL
       | https://arxiv.org/abs/2202.00273 , or Tensorfork's chaos runs
       | https://gwern.net/gan#tensorfork-chaos-runs . 'Computing is a pop
       | culture'...)
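        | 
        | To make the speed difference concrete, the two sampling loops
        | look roughly like this (schematic only; G and denoise_step are
        | stand-ins, not this site's code):
        | 
        |   import torch
        | 
        |   # GAN: one forward pass per image
        |   def gan_sample(G, n, z_dim=512):
        |       z = torch.randn(n, z_dim)
        |       return G(z)                 # single network call
        | 
        |   # diffusion: one network call per denoising step (often 20-50)
        |   def diffusion_sample(denoise_step, shape, steps=50):
        |       x = torch.randn(shape)
        |       for t in reversed(range(steps)):
        |           x = denoise_step(x, t)  # U-Net pass every step
        |       return x
        | 
        | Editing is similarly direct with a GAN: you keep the z that made
        | an image and nudge it, rather than re-running a sampling chain.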
        
         | tavavex wrote:
         | While there's discussion on the topic here - are there any
         | resources that can explain the exact mechanism of how a GAN
         | works for image generation? I have a rough idea of how
         | diffusion models work, but I'm still no AI researcher.
        
         | brucethemoose2 wrote:
         | There was a whole community around ESRGAN img2img finetuning
          | kinda like the Stable Diffusion LoRA community... albeit a much,
         | much smaller one.
        
         | dragonwriter wrote:
         | > If you're wondering how it's so fast and cheap and they can
         | generate variants so easily
         | 
          | I assume it's cheap because they are burning money to build a
          | business, it's not fast at all, and the quality... sucks.
         | 
         | > And if you're wondering how it can look so good
         | 
         | I'm not.
         | 
         | I'm wondering why they're trying to get people to use something
         | worse than using a decent photorealistic SD1.5-based checkpoint
         | with some basic prompt templating.
         | 
         | Not saying GANs can't be awesome, just that this site isn't
         | what I'd use to make that case.
        
           | mrguyorama wrote:
           | Looking at the poses, it feels optimized for generating porn,
           | but one example someone showed had a child's face (good god
           | please don't let your "AI" system generate child anything if
           | you want to sell it for porn purposes), and another user
           | noted that their attempt error'd out because it "detected
           | nudity", even though other users get given a nude model by
           | default.
        
             | stavros wrote:
             | What exactly is the argument against AI-generated child
             | porn? Are we worried that it will somehow turn people into
             | pedophiles, or do we not want to take jobs away from actual
             | children?
        
               | katabasis wrote:
               | 1. You are potentially giving a shield of deniability to
               | people who create or distribute real CSAM because now
               | they could claim that the images are just AI generated
               | and therefore "harmless"
               | 
               | 2. Efforts to stamp out real child abuse may be
               | undermined by a flood of AI-generated false positive
               | imagery
               | 
               | 3. When people see something over and over again they
               | start to think that it's normal. AI generation of this
               | kind of material (something which can be done at a huge
               | scale) risks normalizing the sexual abuse of children.
               | 
               | I'm sure there are many other arguments beyond these.
        
               | tempestn wrote:
               | To expand on what I take as your implied argument- Some
               | (small) percentage of people are pedophiles, meaning
               | they're attracted to children. Presumably they can't help
               | that, just as others can't change their sexual
               | preferences. Clearly acting on this urge with an actual
               | child is wrong. That's true whether it's directly
               | assaulting a child, or consuming child porn, as that
               | market encourages others to exploit children to generate
               | it. However, if it is possible to produce CP without
               | involving actual children, it could provide an outlet for
               | those desires that would reduce demand for actual CP, and
               | thereby reduce incidents of children being abused to
               | produce it.
               | 
               | One could argue that such an outlet could even reduce
               | incidents of direct sexual assault of children by
               | pedophiles, but there is also a counter-argument that it
               | would instead serve to "whet the appetite" and encourage
               | such behaviour. And of course there are other counter-
               | arguments; it could make actual CP more difficult to
               | detect, for one. Finally there is the argument from the
               | perspective of fundamental morality, that depicting
               | children in a sexual manner is wrong in and of itself,
               | and therefore the various potential effects are
               | irrelevant. (Much like it's wrong to murder an innocent,
               | even if you could harvest their organs and save five
               | others as a result.)
        
               | hackinthebochs wrote:
               | It's interesting to notice when utilitarian arguments are
               | accepted and when they're rejected. The argument offered
               | in favor of abortion without limits tends to be that
               | women will get abortions regardless, they will just be
               | dangerous. Presumably the greater good is served by
               | allowing abortions despite the moral issues surrounding
               | killing fetuses/unborn children. I have no trouble
               | imagining many people supporting such a utilitarian
               | argument for abortion but not for generated CP. Though I
               | have a hard time making the distinction intelligible.
        
               | stavros wrote:
               | That's a good summary, thanks. I think AI-generated will
               | lead to actual child porn not making financial sense
               | (hopefully, anyway). I also don't think that the
               | "whetting the appetite" argument is true, from other
               | areas I've seen (eg playing violent games doesn't lead
               | you to becoming a murderer), but I have no data on that.
        
               | ThrowAway1922A wrote:
               | > What exactly is the argument against AI-generated child
               | porn?
               | 
               | Currently? The fact that all the models need training
               | data and the law will see that as victimizing the people
                | who were used in the data set, be they adults or children.
               | 
               | Overall? The fact that it's disgusting and pedophiles
               | deserve things which I can say IRL and everyone agrees
               | with, but on HN will get me banned.
               | 
               | Many countries ban underage anime porn too. Children and
               | their likeness are off limits.
        
               | richie_adler wrote:
                | Far be it from me to defend pederasty, but I'm quite sure
                | I would disagree with the thing you wouldn't want to
               | publish, RL or not.
        
               | CapitalistCartr wrote:
               | The argument against it is that the police and
               | prosecutors don't care about your arguments.
        
               | dragonwriter wrote:
               | > What exactly is the argument against AI-generated child
               | porn?
               | 
               | As something you generate in a photorealistic image
               | generator you are building a business around?
               | 
               | The fact that it is a serious crime in many jurisdictions
               | and, even where it isn't, photorealistic child porn
                | images that get noticed anywhere are going to result in
               | uncomfortable conversations for everyone involved in the
               | process of establishing that they aren't evidence of a
               | crime.
        
               | blitz_skull wrote:
               | Probably moral depravity if I had to guess. Not sure why
               | we even need "an argument" against it. It's pretty self-
               | evidently wrong.
        
               | wizofaus wrote:
               | What if it turned out it were the only effective way to
               | prevent people engaging in real paedophilia?
        
               | JackFr wrote:
               | Prisons?
        
               | Jolter wrote:
               | But it's not.
        
               | trehalose wrote:
               | That's a very big "what if". What data could demonstrate
               | that to be true or false?
        
               | [deleted]
        
               | masfuerte wrote:
               | Under English law creating a rough hand-drawn child porn
               | sketch for your own amusement is a serious crime. I don't
               | understand the rationale for this, but people should be
               | aware that if they use a porn generator and it spits out
               | an image that looks like CP then they will have committed
               | an offence in England.
        
             | dotancohen wrote:
             | I don't know if the thing in the crotch is a penis or a
             | scrotum, but it is definitely NSFW:
             | 
             | https://images.generated.photos/0wV1dBnZ15hGneEfqfZT7SdEIil
             | l...
             | 
             | My prompt was simply "Standing in front of a rocket.".
        
               | oniony wrote:
               | That image is so full of wtf
        
         | GaggiX wrote:
         | >they're too unstable', a widespread myth
         | 
         | >See for example BigGAN
         | 
         | I remember when you try training a BigGAN model on anime
         | images, the quality was bad. Now look at this example, one
         | single GPU, 1.5M images with a diffusion model:
         | https://medium.com/@enryu9000/anifusion-diffusion-models-for...
          | The difference in quality is absurd. You can say this or that
          | is not true, but the quality speaks for itself: obtaining good
          | quality on complex distributions is much easier with a
          | diffusion model than with a GAN.
         | 
          | For example, in the case of the site linked they have
          | conditioned the model on poses, because you're not going to
          | get anything close to coherent without them with a simple
          | StyleGAN, as they say they're using.
        
           | Tyr42 wrote:
           | Broken link?
        
             | GaggiX wrote:
             | Fixed thx
        
           | gwern wrote:
           | > I remember when you try training a BigGAN model on anime
           | images, the quality was bad
           | 
           | Because there was a bug in the code, in a part unrelated to
           | the GAN itself.
           | 
           | > the difference in quality is absurd
           | 
            | Yes, it _does_ help to train on anime with code that isn't
           | buggy. (BTW, Skylion was getting good results with GANs on
           | anime similarly restricted to centered figures like those
           | samples, he just refuses to ever publish anything.)
        
             | GaggiX wrote:
             | So you believe that without the bug you would be able to
             | come close to the quality of the diffusion model I have
             | linked? I'm not even asking about using the same compute (1
             | GPU for ~1 month) but if you just believe BigGAN can come
             | close to that in general.
             | 
             | Also the bug is probably related to the added complexity of
             | training a GAN model in comparison to a diffusion model.
        
         | dublin wrote:
         | How it can look so good? ROFL!! It just created a guy with a
         | hand coming off his left ankle in place of a foot, and toes or
          | fingers or something poking out the end of the shoe on his
         | right foot! https://generated.photos/human-
         | generator/64e682308448b8000c5...
        
         | lacoolj wrote:
         | maybe it was fast 58 minutes ago but apparently it is now at
         | peak capacity. even if you don't get rejected, a new image
         | takes minutes
        
         | nbardy wrote:
          | You're overstating the simplicity of scaling a GAN well.
         | 
         | GigaGAN is the best quality out of those and requires 7 loss
         | functions and is incredibly complicated.
         | 
         | Sure GANs can scale, but Diffusion models are drastically
         | easier to scale.
        
           | gwern wrote:
           | No, I'm not. BigGAN did fine on scaling up to JFT-300M with
           | basically no changes beyond model size and a simple
           | architecture. This is also what we were observing, even with
           | a buggy BigGAN implementation. GigaGAN is the best quality,
           | but that's mostly because it's also the biggest; as Table 1
           | shows most of the gains come from various kinds of additional
           | scaling. (And this is moving the goalposts from the usual
           | assertion that "GANs _can 't_ scale" to "they're harder to
           | scale"; note the self-fulfilling nature of such assertions.
           | Considering how there is next to no GAN scaling research,
           | these results are remarkable and show how much low-hanging
           | fruit there is.)
           | 
           | Diffusion models are only 'drastically easier to scale'
           | because researchers have spent the past 3 years researching
           | pretty much nothing _but_ diffusion models, discovering all
           | sorts of subtle issues with them and how to make them scale,
           | which is why it took them so long to become SOTA, and why
           | massive architectural sweeps like
           | https://arxiv.org/abs/2206.00364#nvidia were necessary to
           | discover what makes them 'easier to scale'. If this level of
           | brute force and moon math is 'easy', lord save us from any
           | architecture which is 'hard'!
        
             | GaggiX wrote:
              | Researchers spent several years trying to even create a
              | GAN that can fit a distribution of aligned faces well
              | (which resulted in StyleGAN1/2); with a simple U-Net with
              | the epsilon-objective and a cosine schedule you can fit
              | much more complex distributions, still using one loss:
              | L1/L2.
              | 
              | Reading your comments makes me feel like you believe that
              | every researcher (even extremely smart people like Karras)
              | just switched to diffusion models because they are idiots,
              | and that they should have instead focused on GANs, and
              | today we would have GANs that are as powerful as or more
              | powerful than the diffusion models we have today and also
              | work in one step; this is just a weird delusion. Diffusion
              | models are simply much easier to train (just an L1/L2 loss
              | in most cases) and to write (for example, your buggy
              | BigGAN implementation); they usually work out of the box
              | at different resolutions and aspect ratios, and you can
              | just finetune them if you want to create an inpainting
              | model. And as things stand you need much less compute to
              | reach good image coherency, or maybe to reach a coherence
              | that has not been achieved by GAN models at all. I would
              | be curious, even as a small-scale experiment, what a GAN
              | (with ~55M parameters) would be able to do after 1-2 days
              | of GPU time of training on the Icon645 dataset, because my
              | diffusion model, I can assure you, is much better than I
              | could have imagined while being trivial to implement (I
              | just implemented a U-Net as I remember one, nothing
              | rigorous and of course no architecture sweep).
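              | 
              | For reference, the training objective being described here
              | really is just a few lines (a sketch: model(xt, t)
              | predicts the added noise, abar is the cosine
              | cumulative-alpha schedule):
              | 
              |   import torch
              |   import torch.nn.functional as F
              | 
              |   def diffusion_loss(model, x0, abar):
              |       b = x0.shape[0]
              |       t = torch.randint(0, len(abar), (b,))
              |       eps = torch.randn_like(x0)
              |       a = abar[t].view(b, 1, 1, 1)
              |       xt = a.sqrt() * x0 + (1 - a).sqrt() * eps
              |       return F.mse_loss(model(xt, t), eps)  # or L1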
        
             | ShamelessC wrote:
             | > Diffusion models are only 'drastically easier to scale'
             | because researchers have spent the past 3 years researching
             | pretty much nothing but diffusion models
             | 
             | This is what tends to happen when you find a superior
             | method.
             | 
              | GANs are fine, they have plenty of promise for tasks
             | requiring rapid inference. But diffusion models beat GANs
             | on robustness and image quality every time.
        
         | sebzim4500 wrote:
         | >And if you're wondering how it can look so good
         | 
         | I don't think anyone is wondering this, especially if they are
         | used to playing with diffusion models.
        
         | motoboi wrote:
         | What I'm really wondering is how can this be free. What is the
         | business model here?
         | 
         | Are they using me to refine the model in some way?
        
           | htrp wrote:
           | Human evaluation is super expensive so yes. Seeing which
           | sessions you discard and keep alone is worth the compute
           | time, especially if it's GAN based.
        
           | irrational wrote:
           | It says free for non-commercial. If commercial, contact us. I
           | assume they plan on paying the bills with commercial work.
        
         | syntaxing wrote:
         | > everyone knows GANs don't work because they're too unstable
         | 
          | Is that a widespread myth? I thought it was widely accepted
          | that GANs are really good at generating these artificial
          | pictures (it's what started DeepFake after all) when you know
          | your model's "button". Similar to how this uses a GAN since
          | they have a model "boundary condition". While humans are
          | diverse, we have a set of repeatable features (two legs, two
          | arms, etc.). Diffusion models are great because you can control
          | the latent space with something way more generic, like text,
          | hence why they've been so much more mainstream.
         | 
         | Edit: actually I might be misremembering, I think Deepfake used
         | VAEs?
        
           | gwern wrote:
           | It is very widespread. You will see people in this very
           | thread dismissing GANs as fundamentally failed, and hotly
           | objecting to any kind of parity, even if they have to fall
           | back to 'well ok GANs do scale, but they're more
           | complicated'. I also have some representative quotes in my
           | linked draft essay from various papers & DL Twitter
           | discussions. (Another way to put it would be: when was the
           | last time you saw someone besides me asserting that GANs can
           | scale to high-quality general images and are not dramatically
           | inferior to diffusion? I rest my case.)
        
             | ShamelessC wrote:
             | Sounds very important to you that you don't have to change
             | the premise of your essay or ever admit you're wrong. No
              | one is dismissive of GANs here without justification.
             | They're fine. They don't beat diffusion, but they're fine.
             | 
             | You come across as severely, _severely_ biased and
             | reactionary.
        
       | tomcam wrote:
       | "Want more generated people?" is the most 2023 ad headline yet
        
         | phyzome wrote:
         | "Currently, we do not have any limits to the number of humans
         | you can generate."
         | 
         | This has widely been seen as something of a problem,
         | environmentally.
        
       | AmazingTurtle wrote:
       | NSFW? Also.. WTF with their detection algorithm, this is easily
       | abusable. This was the first image I was prompted with
       | https://generated.photos/human-generator/64e65d5a8448b8000b5... I
       | have not changed any of the parameters, they were automatically
       | generated on /new
        
         | joker_minmax wrote:
         | This is the thing that comes to you if you take too much
         | Benadryl at noonday.
        
       | tomrod wrote:
       | Wow, this went from reasonable to "holy crap that's nude" without
       | any prompting real fast.
        
       | ramoz wrote:
       | NSFW
        
         | the8472 wrote:
         | quite the opposite. I get a lot of the outputs filtered without
         | any NSFW prompting.
         | 
         | > We detected that generated image contains nude content. Try
         | changing parameters.
        
           | jonnycomputer wrote:
            | Oh, in a few iterations I got a nude sexy adult woman.
            | Clearly they're at risk of generating child porn (you can
           | change the age to child or teen, though for obvious reasons I
           | haven't tried it).
        
           | ramoz wrote:
           | Not _quite_ the opposite.
           | 
           | I clicked the female generation, and got a porn model posing
           | nude. Without any provided guidance other than the clickable
           | buttons.
        
       | hn_throwaway_99 wrote:
       | I get that most of these are hilarious (this is my favorite
       | comment on HN in some time,
       | https://news.ycombinator.com/item?id=37239909 ). But still, I
       | find this incredibly frightening. These are only going to get
       | better. Does anyone doubt that in a couple years time (if that)
       | we'll be able to put the image of any known public person into
       | whatever generated photo we want, which would be
       | indistinguishable from reality? We're not that far already (see
       | the Pope in a puffy jacket).
       | 
       | My only hope is that this extreme enshittification of online
       | images will make people completely lose trust in anything they
       | see online, to the point where we actually start spending time
       | outside again.
        
         | declan_roberts wrote:
         | The good news is that legal courts have already lost faith in
          | all things digital imagery, and have for a good long while.
         | They're actually way ahead of the curve.
        
         | ChatGTP wrote:
         | I think already a thing?
        
         | mmh0000 wrote:
          | We're basically there right now between Deepnude[1] and
         | Photoshop.
         | 
         | [1] (NSFW, seriously.) https://deepnude.cc/
        
           | tennisflyi wrote:
           | Not sure if these should have light brought upon them or stay
           | under rocks.
           | 
           | [2] (NSFW, seriously.) https://undress.app
           | 
           | [3] (NSFW, seriously.) https://porn.ai
        
       | corey_moncure wrote:
       | Probably shouldn't have made every individual adjustment to the
       | gen parameters require a generation round-trip to persist them
        
       | aubanel wrote:
       | Wow, the "one more click" effect is strong with that one... I did
       | not expect anything useful to come out of experimenting with
       | this, yet here I still am half an hour later. Congrats to the
       | makers, it's impressive!
        
       | chewmieser wrote:
       | Some of the generated models were pretty damn good but without
       | any additional prompting I ended up with the standard oddities
       | like multiple limbs.
       | 
        | I like the UI functionality though. Easy to dial in what you're
        | looking for.
        
       | generaltsos wrote:
       | At last, a way to complete the AI-generated cycle that
       | https://thispersondoesnotexist.com/ started.
        
       | 4ec0755f5522 wrote:
       | If you refuse their tracking and marketing cookies it redirects
       | you to google.com. Classy.
        
         | bee_rider wrote:
         | I wonder if their business model is tracking and marketing.
        
         | anigbrowl wrote:
         | I'm surprised browsers don't offer something like Docker so
         | that each site is isolated to its own virtual environment.
        
           | ormax3 wrote:
           | private/incognito window?
        
             | anigbrowl wrote:
             | That forgets the whole session when you close it. I meant a
             | way to isolate websites for tracking purposes but also
              | continue to use them over time rather than throwing away
             | cookies.
        
           | the8472 wrote:
           | https://addons.mozilla.org/en-US/firefox/addon/temporary-
           | con...
        
             | bunnybender wrote:
              | The creator and maintainer of that extension passed away
              | in January.
             | 
             | https://github.com/stoically/temporary-
             | containers/issues/618
        
           | evan_ wrote:
            | Chrome profiles work exactly like this: you can set up any
            | number of profiles, and each one has its own
            | configuration, sessions, etc.
            | 
            | I use home and work profiles on my laptop, for instance;
            | it works really well.
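            | 
            | For example, a minimal sketch of that kind of per-site
            | isolation, assuming a Chromium-based browser installed as
            | "google-chrome" (the binary name, directory layout, and
            | profile names here are just illustrative):
            | 
            |     import subprocess
            |     from pathlib import Path
            | 
            |     # One profile directory per site; Chrome keeps cookies,
            |     # cache, and sessions separate for each --user-data-dir,
            |     # and they persist between launches (unlike incognito).
            |     PROFILES = Path.home() / "browser-profiles"
            | 
            |     def open_isolated(site_name: str, url: str) -> None:
            |         profile_dir = PROFILES / site_name
            |         profile_dir.mkdir(parents=True, exist_ok=True)
            |         # "google-chrome" is an assumption; substitute the
            |         # browser binary name or path for your platform.
            |         subprocess.Popen(
            |             ["google-chrome", f"--user-data-dir={profile_dir}", url]
            |         )
            | 
            |     open_isolated("generated-photos", "https://generated.photos/")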
        
         | exceptione wrote:
          | That violates EU law, and you can absolutely get fined for
          | this behaviour. As a digital service provider you can ask the
          | user for permission to track non-essential information about
          | them, but your service should work the same regardless of
          | whether the user says yes or no.
          | 
          | If this service is hell-bent on raping your privacy, they
          | will have to limit their offerings to mostly those living in
          | dictatorships and immature democracies.
        
         | squeaky-clean wrote:
         | You can just hit the back button and use the website without it
          | popping up again. I refused, but they're probably still
          | setting cookies after I hit the back button.
        
         | [deleted]
        
       | jrflowers wrote:
       | Finally a website that unprompted answers the question "What if
       | Wednesday Addams had enormous breasts?"
       | 
       | Edit: lol https://generated.photos/human-
       | generator/64db2561ba3ed6000ca...
        
         | joker_minmax wrote:
         | It gave you a Sims character?
        
           | jrflowers wrote:
           | That was after several Barbie dolls. Great website.
        
             | jeroenhd wrote:
             | The prompt for the image you linked says "in sims world".
             | It's in the bottom left field.
        
               | dragonwriter wrote:
                | The thing is, if you are advertising the generation of
                | infinite photorealistic humans, your automatically
                | generated prompts should probably serve that end.
        
               | jrflowers wrote:
               | That's cool, I got there through hitting the refresh
               | button.
        
       | dcdc123 wrote:
       | Be careful at work...it sometimes generates a realistic nude even
       | with clothing selected.
        
       | chefandy wrote:
       | Incredible. In the time I'd spend creating one image that exactly
        | fits my or my client's needs or buying a high-quality stock
        | photo, I can generate literally millions of photorealistic,
        | unappealing images that would require a skilled commercial
       | artist to make useful for all but the most throwaway uses. What
       | about for some high-volume throwaway use case? I generated like 5
       | images before I got a 3/4 shot instead of a full-body shot.
       | _bzzzzzzzt._
       | 
       | Trying to 'wing it' with engineers doing what designers should be
        | doing is bad enough when you're just making regular interfaces,
       | but when you're trying to sell a commercial art product, you need
       | people with subject matter expertise. No matter how cool the
       | technology is, and no matter how well it theoretically serves a
       | commercial art customer base, if you're selling art, it's going
       | to be critiqued as such. Hope you've got a thick skin.
        
         | mrguyorama wrote:
          | But think of how much easier this makes believable spam and
         | scams!
        
       | endisneigh wrote:
       | Not bad at all, but what's the main use case?
       | 
       | The site lists all of these things you can do, but are those
       | things people needed or wanted? Is the idea to replace stock
       | photography of people?
        
         | Icons8 wrote:
         | Stock photography, better selfies, or simply fun. Also, people
          | always invent uses the creators never thought of.
        
       | drik wrote:
       | FYI: the site places 3 cookies on the visitor's computer without
       | consent
        
         | [deleted]
        
         | tzs wrote:
         | Do they need consent?
         | 
         | One cookie looks like it just records whether or not a tooltip
         | that they want to show to first time users has been shown. The
         | other two appear to be some kind of session cookies.
         | 
         | They might count as strictly necessary cookies.
        
       | colinrand wrote:
        | What I find sketchy is that it is not easy to find out who is
        | behind this service. The norm is an About Us page or a link to
        | a parent site. I briefly skimmed the legalese (ToS & Privacy)
        | and it's still not clear who these people are or where they
        | operate from. The LinkedIn page shows 8 people working there,
        | mostly in BD from outside the US.
       | 
       | I don't think there is a nefarious purpose going on, i.e. getting
       | people to sign up and stealing their info or payments, etc.
       | However, it contributes to the erosion of trust on the internet.
       | You're no longer sure if you're talking to a real dog in pajamas
       | online or an AI pretending to be one.
        
         | lancesells wrote:
          | I find that a lot of Show HN posts (YC companies included)
          | that make it to the front page have the same problems. I
          | usually don't comment on them, but I find it crazy that
          | someone would launch either a paid product or something that
          | takes your private information without saying who they are
          | or where they're based.
        
         | paint wrote:
         | It's also prominently asking you to upload a picture of your
          | face alongside the rest of the controls.
        
       | TuringNYC wrote:
        | For a number of use cases, this would be most helpful if it
        | could be combined with tools that move lips/cheeks to simulate
        | speech.
       | However, the toolsets seem to be fractured at this point. Does
       | anyone have a good workflow for this?
        
       | bwooceli wrote:
       | Default human is a "Young Adult" woman, and the default "add
       | something" was "woman with tatoos". I changed ONE filter (from
       | young to Senior). It spun for about 20 seconds and then gave me
       | the same woman's face but older. She is also topless. I'm
       | impressed (?)
        
         | wedn3sday wrote:
         | I had very different default settings, so I think there's some
         | randomization going on here.
        
       | satvikpendem wrote:
       | How does it compare to https://photoai.com by Pieter Levels?
        
       | Zardoz84 wrote:
       | Wonderful dystopia we are creating
        
       | MPSimmons wrote:
       | What would a company do with 10,000 photorealistic photos of an
       | AI generated human... per month?
        
         | Icons8 wrote:
         | Train their models
        
         | DonsDiscountGas wrote:
         | On-demand generation of NPCs for video games? Or background
         | extras in movies?
         | 
          | Or maybe people trying on clothes virtually.
        
       | wpwpwpw wrote:
        | Really easy to jailbreak it into generating nudes.
        
       | dvngnt_ wrote:
       | > Thanks to our advanced AI algorithms, you won't tell generated
       | humans from real people
       | 
        | If the images posted are the best they can do, then I have some
        | bad news for them.
        
         | wredue wrote:
         | The first photo generated for me made everything look plastic.
         | Unnatural sharp lines on everything. Shadows from 5 different
         | directions.
         | 
         | It's laughable to call these "hyperrealistic".
        
         | sdflhasjd wrote:
          | Marketing taking it too far, as usual. They certainly have
          | mastered peak uncanny valley, though I'm not really sure what
          | this is useful for.
        
         | function_seven wrote:
         | I can count on one hand the number of ways these photos fail.
         | That's right, 6 ways.
        
           | nocman wrote:
           | Count Rugen sees no problem with this.
        
             | function_seven wrote:
             | Eh. On one hand, I guess it's no problem at all. But on the
             | other hand...
        
           | paint wrote:
            | If you encode a binary digit for each biological digit on
            | your hand, you can count up to 31 on one hand (2^5 = 32
            | distinct values).
        
         | albert_e wrote:
          | The first image I generated was worse than that:
          | 
          | A mermaid with plastic-looking skin and badly rendered ocean
          | water in the background.
         | 
         | https://generated.photos/human-generator/64d6dde03af7f90007c...
        
           | irrational wrote:
           | Looks as realistic as every other real mermaid I've seen ;-)
        
           | klyrs wrote:
           | I got exactly that image too! I guess the "random human"
           | isn't so random. This calls their "real time" claim into
           | question...
        
       | klyrs wrote:
       | Amusing. My first two "random human" samples had completely
       | ordinary uncanny valley issues (eye was smushed and blurry,
       | weirdly shark-like teeth in child's mouth). But the third looks
        | pretty good! ...for a 90s-era POV-Ray Barbie doll model.
       | 
       | https://generated.photos/human-generator/64d552c85263da00077...
        
         | dragonwriter wrote:
         | The text prompt for that image (one is generated for the
         | "random" images) is "barbie doll", so in this specific case its
         | not so much an imagegen problem as other parts of the app
         | design not matching the advertised behavior.
        
           | klyrs wrote:
           | Ah, funny. Their interface hid that box from me on my phone.
           | Weird choices all around.
        
       | ProjectArcturis wrote:
        | Seems rude that if you refuse their cookies they redirect you to
       | Google.
        
       | SrslyJosh wrote:
       | Refusing cookies redirects to Google? Kinda scummy.
        
       | DonsDiscountGas wrote:
        | Neat. I'd really like to have a setting for attractiveness;
        | all of these people look like models.
        
       | ozten wrote:
       | Congrats on the slick design!
       | 
       | Ethnicity: American
       | 
       | What does that mean in latent space and does this mostly
       | represent training bias?
        
         | Digit-Al wrote:
         | It's also got "Irish" but not "Scottish" or "English". Very
         | odd.
        
       | rendall wrote:
       | This was the first one I saw: https://generated.photos/human-
       | generator/64d67731568faa0007a...
        
       | ricardobeat wrote:
       | They are overselling the capabilities of their model a bit. The
       | boy posing has facial artifacts, and the first "human" I
       | generated is a painterly mermaid with a disjointed background.
       | Results from photoai.com or many models available on civit.ai
       | look a lot more realistic.
        
       | joker_minmax wrote:
        | Is it just me, or is something kind of weird about all the
        | breasts on the example women? They all look really high-set,
        | and combined with the fact that other people have gotten nudes
        | back from this tool (as shared in the comments below), I'm
        | thinking that the dataset they used here was really catered to
        | a "certain" audience.
        | 
        | Edit to add: It's not fast; it's showing you repeats of stuff
        | it already made on the first try. Which is probably why I got
        | 5 men in tight pink shirts eating cake in a row. ???
        
       | sandgiant wrote:
       | The only thing I changed from the default parameters was Age ==
       | Teenager. That resulted in this error:
       | 
       | We detected that generated image contains nude content. Try
       | changing parameters.
       | 
       | Not sure what to make of this, but it feels wrong, somehow?
       | 
       | Edit: This was the prompt it generated for me on page load:
       | "Minerva McGonagall in Hogwarts, wearing Hogwarts robe and witch
       | hat" - https://generated.photos/human-
       | generator/64e650a39563e6000e0....
        
         | thomastjeffery wrote:
         | "teen" is a ubiquitous porn category that, in practice,
         | describes a body type, not age; similar to how "babe" almost
         | never means "infant".
         | 
         | I would be more surprised to get SFW results from that prompt,
         | considering the result would be based on more heavily regulated
         | (less common) photographs of minors.
        
           | mrguyorama wrote:
           | No, teen in porn absolutely means 18 and 19, or at least "I'm
            | '18-19' and definitely not a 26-year-old"
        
         | Icons8 wrote:
         | Over self-censoring
        
         | izzydata wrote:
         | I got the same thing and am very confused. They are the ones
         | generating the image. Why did they generate porn if they don't
         | allow it? Also apparently clothed teenagers are now
         | pornographic? I think their image analysis needs some work.
        
       ___________________________________________________________________
       (page generated 2023-08-23 23:00 UTC)