[HN Gopher] Unlaunching the 12GB 4080
       ___________________________________________________________________
        
       Unlaunching the 12GB 4080
        
       Author : haunter
       Score  : 273 points
       Date   : 2022-10-14 16:28 UTC (6 hours ago)
        
 (HTM) web link (www.nvidia.com)
 (TXT) w3m dump (www.nvidia.com)
        
       | dragontamer wrote:
       | NVidia proving to everyone why EVGA quit being their partner.
       | 
        | This sounds horrific for the card manufacturers who have made a
        | ton of product and are now unable to sell it.
        
         | karamanolev wrote:
          | Could it be that EVGA got a whiff of that 4080 12GB business
          | and noped out because of it? We're only hearing about it now
          | because AIBs knew sooner?
        
           | usednet wrote:
            | Nvidia tells AIBs next to nothing (even pricing!) until
            | public release.
        
           | Melatonic wrote:
           | I think EVGA was probably unique in that the owner was
           | passionate about actually providing a damn good product and
           | service and at some point just got tired of it - hard to
           | blame them. Running a business is stressful and if you have
           | to become mediocre after striving to be the best it must just
           | seem pointless.
        
         | shmde wrote:
         | > NVidia proving to everyone why EVGA quit being their *bitch.
        
         | paol wrote:
         | The launch of the 4080s is still a ways off.
         | 
          | I doubt any partners had designs finalized, given the rumors of
          | how little time they are given to do that before each launch
          | (which is a problem in its own right, and one of the things
          | they are known to complain about).
        
           | lbotos wrote:
            | I thought they were supposed to land in "November". Are
            | manufacturers able to:
            | 
            | - source components
            | - finalize packaging design
            | - finalize cooling design
            | - assemble
            | - ship
            | 
            | graphics cards in 47 days? I'd expect at MINIMUM 90 days.
        
       | ajaimk wrote:
       | Now will Nvidia have the courage to re-release it as the 4070?
        
         | dannyw wrote:
          | They wouldn't make it that obvious. They'd weaken the specs a
          | little, like 5% fewer CUDA cores or something.
        
         | kickofline wrote:
         | Aren't the chips already made, so they have to do something
         | with them?
        
           | wmf wrote:
           | Yes, they will presumably release it as a 4070 or 4070 Ti.
        
           | Razengan wrote:
           | This is why biological tech is superior: You can just eat
           | failed products.
        
             | dylan604 wrote:
             | Unless those failed products turn into something toxic
        
         | izacus wrote:
        | Sure, they'll just renumber it, sell it at the same price, and
        | gamers will move their screaming to a new issue.
        
       | kickofline wrote:
       | Despite Nvidia's past actions, I think this was the right move.
        
       | wmf wrote:
       | Wow. So there is a limit to how much abuse the market will
       | accept.
        
         | WrtCdEvrydy wrote:
         | Every single reviewer has shit on this thing... Nvidia might be
         | semi-scared.
        
         | jayd16 wrote:
          | Why do you say that? They just pulled a cheaper SKU, leaving
          | the more expensive one, no?
        
           | wmf wrote:
           | The "4080 12GB" name was misleading almost to the point of
           | fraud because it led customers to think it would have the
           | same performance as a 4080. I don't object to the product
           | itself, just the name. Removing it from the market is the
           | right move. Maybe they'll relaunch it with a non-misleading
           | name.
        
           | S0und wrote:
            | Every hardware reviewer complained that calling something an
            | xx80 while the card performs like an xx70 is a scammy move.
        
           | jffry wrote:
            | In the past, NVIDIA launched the "3080" and the "3080 (12GB)"
            | where the memory capacity and bus width were the major
            | differentiators. (It also had like 3% more cores or
            | something.)
           | 
           | The RTX "4080 12GB" and "4080 16GB" were much further apart
           | in cores (IIRC a 30% difference) and so naming them in the
           | same category in a way that suggests the RAM is the primary
           | difference, was widely seen as disingenuous.
           | 
           | Yes technically any consumer could lookup the specs, but that
           | doesn't still make it a dirty and dishonest move.
        
       | samvher wrote:
       | What a strange post - the name is not right, so it's unlaunched.
       | No indication of a new name? Am I supposed to conclude that the
       | whole product is canceled for now? If that's the case it seems
       | unlikely that the naming error is the whole reason.
        
         | mariusmg wrote:
          | The cards are already made; they will surely be launched in
          | the future as the RTX 4070 Ti.
        
           | selectodude wrote:
           | Do we know that for sure? It's possible that the fake 4080
           | was for yield reasons but the yields are higher than
           | anticipated. May as well sell the same chips for more money
           | and "unlaunch" the shitty product.
        
             | tedunangst wrote:
             | It's a different die.
        
             | mariusmg wrote:
              | Yes, back in August warehouses already had stocks of RTX
              | 40xx cards.
              | 
              | This reversal has nothing to do with yields; nVidia
              | (rightfully) realized it's better to avoid a shitstorm,
              | because this 4080 12GB version is ~30% slower than the
              | "real" 4080.
        
         | [deleted]
        
         | TheRealPomax wrote:
          | AIBs will have already made _tons_ of cards, boxed up and ready
          | to ship out, just contractually unable to sell them en masse
          | quite yet. And now they _really_ can't sell them until they've
          | reprogrammed and reboxed them to show up as (almost certainly)
          | 4070 instead of 4080, because that's essentially what they are.
          | The 4070 wasn't missing from the lineup: the GPU chip on the
          | 4080 12GB model is literally a completely different chip from
          | the one on the 4080 16GB.
          | 
          | (Think of it like Intel calling the 14th gen i5 "an i9". Or
          | heck, Exor deciding to label a Fiat sports car "a Ferrari 812
          | V4" while keeping the real thing "a Ferrari 812 GTS".)
          | 
          | And of course, that's a _gigantic_ dick move, because it costs
          | NVidia nothing to announce this, but probably means no one will
          | even be able to make a profit on the 4080 12GB cards they
          | already made (which a cynic might say was precisely NVIDIA's
          | goal here, as plausibly deniable punishment dished out to the
          | remaining AIBs for daring to let one of their own disobey the
          | corporate overlord).
          | 
          | If EVGA hadn't already broken their partnership, this
          | definitely would have made them do so. "Thank you for jumping
          | through our hoops, after paying for the entire course yourself,
          | now make a new course because you displeased your master" is
          | not a great way to treat your partners.
        
           | Workaccount2 wrote:
            | Nvidia's arrogance is going to do them in.
        
           | vhold wrote:
           | If true, it reminds me of the reputation that Jack Tramiel,
           | founder of Commodore computers, had in the 80s of screwing
           | over his suppliers to the point they would no longer do
           | business with him.
        
           | happycube wrote:
           | Intel all but rebadged dual-core i3's as i5's and i7's in
           | mobile for _years_.
        
         | Traubenfuchs wrote:
          | Yes, this is bizarre and cryptic. I reread it a few times
          | because I thought I'd surely missed them telling me what gets
          | renamed to what.
        
           | FartyMcFarter wrote:
           | It seems pretty clear. As the title says, this is about
           | unlaunching something, not about renaming it or launching it.
           | 
           | Just forget the formerly named "12 GB 4080", it doesn't exist
           | for now.
        
             | kipchak wrote:
             | Don't the cards physically already exist though?
        
               | Rebelgecko wrote:
               | In a month or two they'll probably sell them as 4070 or
               | maybe 4070 Ti cards
        
         | dathinab wrote:
         | The demand was probably also not right, as it was just not a
         | very appealing offer.
         | 
          | They will probably see what AMD does and then rebrand them,
          | maybe with some under- or overvolting tweaks.
        
         | WithinReason wrote:
         | Could this have to do with the demand they see for the 4090? It
         | was launched just 2 days ago.
        
           | neogodless wrote:
           | It's hard to trace logic through that. Demand for the $1600
           | RTX 4090 is from "money be damned, give me frames" consumers,
           | mixed with "can use for ML" professionals, and as such, are
           | not quite the same market segment as the $900 crowd
           | (remembering that the xx80 cards used to be $700 and much
           | closer to the top of the lineup in overall performance.)
        
           | tracker1 wrote:
            | It's a different die than the 4090 (and the 4080 16GB for
            | that matter). They already had a lot of them made, but they
            | may redirect future production to more 4090 and 4080 16GB
            | cards while having the mfgs rebadge the 4080 12GB to say
            | 4070, and just sit on the existing stock until it's
            | reboxed/rebadged.
        
       | seansmccullough wrote:
       | It seems like the Nvidia marketing team should have thought of
       | this before launching the product.
        
         | kipchak wrote:
         | "The risk I took was calculated, but man, am I bad at math."
        
         | dylan604 wrote:
         | Now now, you can't expect them to think of everything now can
         | you? /s
        
       | [deleted]
        
       | CobaltFire wrote:
       | I didn't expect to see this much of a mea culpa after the 1060
       | 3GB and 6GB being the same situation.
        
         | s_dev wrote:
          | Same -- I can appreciate there may be confusion, but this was
          | done by nVidia before. There is a precedent -- though I guess
          | with marketing the customer is always ultimately right.
        
       | ZiiS wrote:
        | I will get downvoted to oblivion, but if you consider yourself
        | an x080 customer and don't want less silicon in a 4080 12GB or
        | the price hike for the 16GB, you have a very viable option of
        | just keeping your current x080 for free. The 4090 is extremely
        | good value for an extremely small group of people wanting ML or
        | perhaps 8K gaming. Cards that will appeal to more than a very
        | select few will be released later.
        
       | helloworld97 wrote:
        
       | Havoc wrote:
        | What the hell is going on at Nvidia? First the very transparent
        | not-a-4070 scam, then a gigantic misjudgement of the crypto
        | demand drop, then the EVGA mess, and now this "unlaunch" that
        | looks like an intern post.
        | 
        | The tech still seems good (e.g. DLSS), but corporate decision-
        | making seems to be in freefall.
        
       | UberFly wrote:
       | Those pics of people waiting outside... yes please Nvidia, more
       | abuse. EVGA seems to be a pretty decent company, and they quit
       | Nvidia. I'm thinking they're the canary in the coal mine.
        
       | tengbretson wrote:
       | They're really doing their best to jerk around their AIB partners
       | as much as possible, aren't they.
        
         | WithinReason wrote:
          | From the reviews, it already seems there's not much point to
          | AIB cards anymore; EVGA got out at the right time.
        
           | kouteiheika wrote:
            | > From the reviews, it already seems there's not much point
            | to AIB cards anymore
            | 
            | There is, because in certain parts of the world NVidia
            | refuses to sell its founders editions, so AIB cards are all
            | you can get.
        
           | formerly_proven wrote:
            | One of those AIB 4090s with a child-sized cooler at a 50-70%
            | power limit should make for a relatively quiet GPU.
        
             | WithinReason wrote:
             | The Founder's Edition coolers are now much better quality
             | than previous generations:
             | https://www.youtube.com/watch?v=CmUb9sDS9zw
        
       | josmala wrote:
        | That 12GB 4080 had compute a bit above the 3090 Ti, but memory
        | bandwidth between the 3070 and 3070 Ti.
        | 
        | If its cache hit rate is similar to a Navi 2 chip of similar
        | size, it would have effective bandwidth similar to the 3090 at
        | 1440p and way above any previous-gen card at 1080p, but at 4K
        | it would be around 3070 Ti territory in terms of memory
        | bandwidth. So a great card for native 1440p, and for those
        | willing to use DLSS to upsample from there to higher
        | resolutions.
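        | 
        | Back-of-the-envelope numbers in Python (bus widths and data
        | rates as announced, from memory, so treat as approximate):
        | 
        |   # bandwidth (GB/s) = bus width (bits) / 8 * data rate (GT/s)
        |   cards = [
        |       ("3070",      256, 14.0),
        |       ("3070 Ti",   256, 19.0),
        |       ("4080 12GB", 192, 21.0),
        |       ("3090",      384, 19.5),
        |   ]
        |   for name, bus_bits, gtps in cards:
        |       print(f"{name:10s} {bus_bits / 8 * gtps:6.0f} GB/s")
        | 
        | That works out to roughly 448, 608, 504 and 936 GB/s: the
        | "4080 12GB" lands squarely between the 3070 and the 3070 Ti.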
        
         | sliken wrote:
         | What had me worried is that the 4080 12GB has a 192 bit wide
         | memory interface, same width as the RTX 3060 and less than the
         | RTX 3060 Ti.
         | 
          | Sure, they worked on the caches to improve performance, but I
          | always worry that some games will do poorly with the caches
          | and have terrible performance. After all, it's the lowest
          | frame rates that are most noticeable, not the max or average.
          | 
          | Charging $900 for RTX 3060 memory width is insane.
        
       | sergiotapia wrote:
        | Weird post, seemingly written by an intern during lunch. The
        | pictures of "lines" scream desperation: "see! people want our
        | cards!". I think Nvidia is in deep trouble.
        
         | protomyth wrote:
          | A funny point is that the line depicted in the second picture
          | down (Micro Center - Burlington) is not exactly unique to an
          | NVIDIA launch. Micro Center often has long lines for a variety
          | of manufacturers' new parts. That place is basically the
          | biggest outlet in the area for the DIY crowd.
        
         | ok123456 wrote:
         | The GPU cost too damn high.
        
           | cheschire wrote:
            | Look at what the predecessors cost. I chose the higher 3080
            | 12GB for comparison.
            | 
            |   1080  $700   2016
            |   2080  $800   2018
            |   3080  $800   2020
            |   4080  $1200  2022
           | 
            | I'm sure there's some justification for why it's a 50%
            | increase in price, but even if it's a necessary increase,
            | releasing it at all seems tone-deaf given the state of the
            | world right now.
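            | 
            | Spelling out the generational jumps (same numbers as the
            | table above), in Python:
            | 
            |   msrp = {2016: 700, 2018: 800, 2020: 800, 2022: 1200}
            |   years = sorted(msrp)
            |   for prev, cur in zip(years, years[1:]):
            |       pct = 100 * (msrp[cur] - msrp[prev]) / msrp[prev]
            |       print(f"{prev} -> {cur}: {pct:+.0f}%")
            | 
            | That prints +14%, +0%, +50%: two modest steps and then one
            | huge one.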
           | 
           | 10 series - https://www.nvidia.com/en-
           | us/geforce/news/geforce-gtx-1080/
           | 
           | 20 series - https://nvidianews.nvidia.com/news/10-years-in-
           | the-making-nv...
           | 
           | 30 series - https://nvidianews.nvidia.com/news/nvidia-
           | delivers-greatest-... and
           | https://en.wikipedia.org/wiki/GeForce_30_series
           | 
           | 40 series - https://www.nvidia.com/en-us/geforce/graphics-
           | cards/40-serie...
        
             | dylan604 wrote:
             | Clearly, they're making up for the money left on the table
             | for not increasing the price from 2080->3080.
        
               | ErneX wrote:
               | 2080 is when they introduced RT and the card performed in
               | raster gfx about the same as the 1080. And support for RT
               | was coming in the future so it kind of makes sense it was
               | priced the same.
        
             | izacus wrote:
             | Now put the performance numbers next to them.
        
         | dymk wrote:
         | That's what stuck out to me as well. This legitimately reads
         | like it's by someone who's never written marketing copy. And
         | what the heck is with the last picture of the box buckled into
         | a desk chair?
        
           | maxsilver wrote:
           | > And what the heck is with the last picture of the box
           | buckled into a desk chair?
           | 
           | The blog post reads like an Intern who is/was a Redditor
           | wrote it. The "GPU buckled into a seatbelt" is an old-but-
           | common PC Builder Meme/Tradition online (particularly
           | Reddit).
           | 
           | https://www.google.com/search?q=gpu+seatbelt&tbm=isch
           | 
           | https://www.reddit.com/r/nvidia/comments/4skdlg
           | 
           | https://www.reddit.com/r/pcmasterrace/comments/4pt9nl
           | 
           | https://www.reddit.com/r/pcmasterrace/comments/kmglh6
        
           | zanecodes wrote:
           | It's a car seat, not a desk chair.
        
             | dylan604 wrote:
              | Oh, but what an amazing desk chair it could be! Heated
              | seat! Motorized recline, height, and position adjustments!
              | Seatbelts for those intense coding sessions (or, more
              | realistically, desk chair races). Hell, managers will love
              | them too, as they already have the butt-in-seat sensors to
              | know if their underlings are "working".
        
         | waking_at_night wrote:
          | Agreed. This post is really weird, especially coming from a
          | multi-billion-dollar company. Perhaps I'm too used to corpo-
          | speak, but this coming from the opposite end does feel kinda
          | fishy.
        
       | mckirk wrote:
       | What is going on over there? First the confusing move to release
       | two 4080s, now a confusing press release about unreleasing one of
       | them, that itself reads like somebody released it prematurely?
        
         | amelius wrote:
         | Perhaps they are too focused on the technology side of things,
         | as opposed to business. Kind of refreshing, actually.
        
         | jamiek88 wrote:
         | I kept looking for the other half of the post!
         | 
         | Did the poster have a gun to his head or something?
         | 
         | "Okay, okay, I've unlaunched the 12gb put the gun down, Linus."
        
       | bagels wrote:
        | This, Nvidia competing against their partners with Nvidia-
        | branded boards, and the EVGA article a few weeks ago make it
        | seem like being in partnership with them must be dreadful.
       | 
       | These card makers now have to sit on inventory and reprint boxes
       | and repackage everything?
        
         | NavinF wrote:
         | The 4080 12gb release date hasn't been announced yet so it's
         | likely that no boxes were printed or packed.
         | 
         | In the long run this is a good move considering how idiotic it
         | is to give the same name to products with different dies and a
         | ~30% performance difference.
        
           | tracker1 wrote:
            | You're talking about supply lines across an ocean... that's
            | a lead time of several months... not to mention the time it
            | takes to make injection molds, etc. These cards were already
            | made and badged... They may just be sitting on pallets
            | waiting to be boxed, or may already be in boxes and/or
            | shipped.
           | 
           | At the very least there's probably some recall operations to
           | ship back, take off the shrouds and put on new shrouds for
           | the rebadge, if not also rebox.
        
         | mikhailt wrote:
         | > These card makers now have to sit on inventory and reprint
         | boxes and repackage everything?
         | 
          | If it is a simple rename, they can just affix a new sticky
          | label reading "4070" on top of the "4080 12GB", for example.
        
           | tracker1 wrote:
           | More than that is the injection molded shrouds and backplates
           | on many of the cards in question, not to mention reboxing and
           | recalling existing/shipped inventory.
        
         | dralley wrote:
         | I mean, they've pissed off nearly everyone they've partnered
         | with in the past. Apple, Microsoft, Sony, the Linux kernel,
         | etc.
        
           | ElectricalUnion wrote:
            | I am surprised Nintendo isn't pissed with them yet; they're
            | also a company that has pissed off nearly everyone they've
            | partnered with in the past.
        
             | Wohlf wrote:
             | Maybe they understand each other better because both are
             | incredibly greedy.
        
           | verall wrote:
            | Apple dropped NV because it's too expensive and Apple is
            | ruthless about their BOM.
            | 
            | What's the story about Microsoft and Sony?
        
             | dralley wrote:
              | Apple dropped Nvidia after Nvidia chips were failing en
              | masse in the late 2000s and early 2010s, and Nvidia tried
              | to publicly pin the blame on Apple, even though Nvidia
              | chips in laptops by other manufacturers were also failing.
             | 
             | https://semiaccurate.com/2010/07/11/why-nvidias-chips-are-
             | de...
        
             | trashcan wrote:
              | This is the best article I could find about Microsoft
              | choosing to end its relationship with Nvidia for Xbox:
             | https://www.sfgate.com/business/article/Nvidia-loses-Xbox-
             | Mi...
        
       | shmde wrote:
        | Nvidia got caught with their pants down, expecting consumers to
        | be stupid and not notice their 4080 12GB card was just a 4070 Ti
        | wearing makeup. They 100% knew it; it definitely was NOT a
        | mistake. They just tried to sell a lower-spec card masquerading
        | as a better card to average customers who would not know any
        | better.
        
       | gjsman-1000 wrote:
       | I can't imagine how much swearing there is going on in meetings
       | with NVIDIA's "partners"[1].
       | 
       | [1] It is increasingly obvious NVIDIA only views them as
       | sycophants.
        
         | ErneX wrote:
         | EVGA decided to quit making GPUs so yeah things must be
         | intense.
        
       | [deleted]
        
       | blake929 wrote:
       | My two cents on Nvidia's rollback reasoning:
       | 
       | 4080 12GB was universally panned. The 40 series launch also got
       | heat for price gouging, particularly the higher cost for the low
       | end of the launch (4080 12GB). They had to raise the cost of the
       | lower end of the 40 series though if they wanted to maintain the
       | value of the 30 series cards and clear out the remaining
       | inventory. They couldn't just release a true 4070 for a true 4070
        | price. While the name was obviously bad, it seems likely that
        | they wanted to obscure the release of a 4070-class chip at a
        | 4080 price while attempting to sell off the remaining 30 series.
        | Pure
       | speculation: maybe they were hoping a "cheaper 4080" would come
       | across to the uninformed as Nvidia trying to lower the entry cost
       | for 40 series rather than raising it through an expensive 4070.
       | 
        | Two potential reasons for the rollback come to mind: 1) higher
        | than expected 4090 demand means they can wait to launch a 4070;
        | 2) higher than expected heat for the thinly veiled 4070 price
        | gouging made it worth waiting on the release, since raising the
        | entry price for the 40 series helps sell more 30 series cards,
        | with better PR in the process.
        
         | bitL wrote:
         | 4090 is in stock in all shops around, so I don't think the
         | demand was higher than expected. Zen 4 is also everywhere but
         | not selling.
        
           | mikhailt wrote:
            | It depends on the location as well. In the US, it's out of
            | stock everywhere.
            | 
            | In the EU, where the price and electricity costs are much
            | higher, it appears to not be selling out like it did in the
            | US. (As far as I know.)
        
             | TillE wrote:
             | I don't see a 4090 in stock anywhere in Germany (big PC
             | gaming culture), except for scalpers on eBay.
        
           | neogodless wrote:
           | > Zen 4 is also everywhere but not selling
           | 
           | I was shocked to learn today that B650 boards are available.
           | That information didn't seem to make it anywhere near my
           | usual technology news channels!
           | 
           | But... they start at $170 for a barebones motherboard. Having
           | spent $200 not too long ago for a well-rounded mid-range X570
            | board, I find $170 for the starting lineup quite steep. And
           | it's unlikely builders want to pair their $300+ Zen 4 chips
           | with the most basic board available.
           | 
           | The barebones right now would be $170 + $300 + $90 (16GB
           | DDR5) = $560 before accounting for the rest of the parts
           | (like a GPU).
        
             | tracker1 wrote:
              | I'm waiting until around March/April... hoping that prices
              | settle by then, also considering rDNA3 and hoping to see
              | an R9 7950X3D model by then before making final decisions
              | on a next build. Also, right now there aren't really any
              | good options for higher-speed DDR5 in higher capacities,
              | and I'm curious to see which boards support ECC by then.
        
             | sliken wrote:
              | Yup, doubling the memory bandwidth, doubling the memory
              | channels, doubling the PCIe bandwidth, and switching to
              | DDR5 is placing a premium on the new AM5 platform for AMD.
              | Something similar happened with the Alder Lake launch,
              | which had the same upgrades, combined with sky-high DDR5
              | memory prices.
              | 
              | Just wait a few months; pioneers are the ones that get the
              | arrows (high prices) in the backside.
        
         | winkeltripel wrote:
          | It's actually even worse. If you look at the core counts, the
          | 4080 12GB is a *60-tier card, and the 4080 16GB is a *70-tier
          | card. The 4090 has a much better power-to-cost ratio.
        
           | _hypx wrote:
           | Same with memory bus. The 3060 Ti, 3070 and 3070 Ti all had a
           | 256-bit bus. Only with the 3060 did it drop to a 192-bit bus.
        
             | happycube wrote:
              | And the 3060 Ti has less memory (8 vs 12GB) - for many
              | non-gaming uses (e.g. deep learning/ML) that makes it much
              | less useful.
        
         | mikepurvis wrote:
         | > higher than expected 4090 demand
         | 
         | Has anyone done analysis on this? My layman's assumption is
         | that with the shortages and gouging/scalping over the past two
         | years, an awful lot of people decided to tough it out on their
         | 10-, 16-, and 20- series cards, and now the narrative is that
         | the shortages are over (whether or not the actual prices really
         | back that up) and those people who skipped a generation or two
         | are now emotionally and financially prepared to "treat"
         | themselves to the new top of the line.
         | 
         | If this is it, though, it seems weird that it could really have
         | caught Nvidia by surprise. Don't they have driver-level
         | telemetry that would show them all those older cards plugged
         | into new-chipset motherboards, and could give them some
         | indication of demand?
        
           | injinj wrote:
           | China fomo? Are these good enough to fill the needs of AI
           | workloads of the datacenters which can no longer get the next
           | gen NVIDIA GPUs?
        
             | mattnewton wrote:
              | Benchmarks I have seen absolutely put them above existing
              | workstation cards in everything except memory. If your
              | model and embeddings fit into 24GB of VRAM, it absolutely
              | makes sense to buy this over an A5500 or even an A6000.
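              | 
              | A rough fit check in Python (weights only, ignoring
              | activations and optimizer state; illustrative numbers):
              | 
              |   def weights_gb(n_params, bytes_per_param=2):  # fp16
              |       return n_params * bytes_per_param / 1e9
              | 
              |   for n in (1.3e9, 7e9, 13e9):
              |       print(f"{n/1e9:.1f}B params ->"
              |             f" {weights_gb(n):.1f} GB")
              | 
              | That gives 2.6, 14.0 and 26.0 GB: a 13B-parameter fp16
              | model already overflows 24 GB before you count anything
              | else.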
        
           | cinntaile wrote:
           | Plenty of people do have the money to spend on these cards.
           | It's entirely possible that it's really just a vocal minority
           | that refuses to pay these prices. I agree with the
           | grandparent and the 4090 probably sells better than expected.
           | The card performs well too.
        
             | AdrianB1 wrote:
              | We are in an economic recession, so even if people _have_
              | the money, many are _not willing_ to spend it on a
              | graphics card. If you also consider parts of the world
              | like Europe, where the price of electricity has more than
              | doubled, and the power consumption of the 4xxx series
              | (practically secondary room heaters), there are even fewer
              | people here willing to pay the price.
        
       | Melatonic wrote:
       | aka " We realized nobody was going to buy this or most of the
       | 4000 series cards "
       | 
       | The 4070 was supposed to be the cash grab card for when
       | everything sold out and desperate people would be willing to pay
       | for it
        
       | mkl95 wrote:
        | Is this post AI-generated? The images are particularly weird.
        
         | youainti wrote:
         | They look like phone images to me. At least phone images from
          | 2015 non-iPhones.
        
       | spelunker wrote:
       | The name is bad, so we're unlaunching it? Is something else going
       | on?
        
         | aix1 wrote:
         | My reading is that they're implying it'll be launched but under
         | a different name.
        
       | Night_Thastus wrote:
       | No way they did this on their own. Retailers or partners must
       | have pushed back on it because they didn't want to deal with
       | upset customers and constant returns, or scams.
        
       | ShakataGaNai wrote:
       | These big companies really need to get naming input from someone
       | other than marketing teams. The second the 4080 and 4080 (not a
       | typo) got announced, Nvidia was shredded by the media. It was
       | immediately and obviously clear to basically everyone that this
       | was a bad naming system and only a bunch of navel gazers could
       | have thought it was "good".
       | 
        | I get that engineers tend to be more practical in their names,
        | and don't have the finesse that marketing is looking for. But at
        | least some sanity checks would be good...
        
         | [deleted]
        
         | dinobones wrote:
         | I'm still mad about CUDA cores. I thought I was going to be
         | able to write some epic 1000x level parallelism running
         | individually on all cores.
         | 
         | It turns out, a CUDA core is not actually a "core."
        
           | Melatonic wrote:
            | Why do they call it that then? I never really looked into it
            | that much and just took it as a measure of a certain type of
            | compute capability (FP16 or FP64, right?).
        
             | wmf wrote:
             | The SIMT architecture makes it look to the programmer like
             | each FPU is a separate core, but all the cores in an SM
             | have to run in lockstep to get good performance.
        
             | monocasa wrote:
             | They use that metric because it makes their marketing specs
             | look nicer.
             | 
             | Ultimately it's a count of the number of SIMD lanes.
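              | 
              | A toy model in Python (purely illustrative, not real
              | hardware behavior): each "CUDA core" is one SIMD lane, and
              | a divergent branch costs the whole warp both paths:
              | 
              |   WARP_SIZE = 32  # one warp = 32 lanes ("CUDA cores")
              | 
              |   def warp_branch(xs):
              |       # The warp walks BOTH sides of a divergent branch;
              |       # a per-lane mask picks which result each lane
              |       # keeps.
              |       mask = [x % 2 == 0 for x in xs]
              |       then_r = [x * 2 for x in xs]  # run for every lane
              |       else_r = [x + 1 for x in xs]  # also every lane
              |       return [t if m else e
              |               for m, t, e in zip(mask, then_r, else_r)]
              | 
              |   print(warp_branch(list(range(WARP_SIZE))))
              | 
              | So the lanes aren't independent cores you can point in 32
              | different directions; they share one instruction stream.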
        
         | yamtaddle wrote:
         | There was a golden age in the '00s when it was possible to get
         | the gist of what Nvidia and ATI card names meant without
         | consulting a very dense table. It was nice.
        
           | systemvoltage wrote:
            | It was amazing actually. Intel's marketing was so
            | spectacular. The Blue Man Group. Bunny suit commercials.
            | _Pentium_, what a name. "Intel Inside", two words that still
            | start an unbidden jingle in my head.
            | 
            | This is not looking at it through rose-tinted glasses and
            | nostalgia. It was objectively better, fun, straightforward
            | and iconic. Not a single person remembers Intel's (or AMD's,
            | nVidia's, Apple's, etc.) advertisements after the 2000s. Do
            | you remember the last Apple ad? No. It is all generic,
            | designer bullshit.
            | 
            | All of it has gone down the toilet. Marketing people have
            | lost it across the board.
        
             | blagie wrote:
              | I agree. I think a deeper problem is that it takes a Ph.D. in
             | Intel / AMD branding to understand what to buy. An 80486
             | was faster than an 80386, and 33MHz was slower than 66MHz.
             | It was simple.
             | 
             | Intel's i7 line-up goes from 2 to 16 cores, 1-4GHz,
             | spanning 13 generations. Toss in i3/i5/i7/i9, and lines
             | like Atom and Xeon.
             | 
             | Each time I need to upgrade my computer, I groan. It's not
             | just less fun, it's positively miserable.
             | 
             | Most people I know either buy the cheapest possible
             | computer, or an Apple. I don't know why Intel thinks anyone
             | will spend extra if they have no idea what they're buying.
              | Most non-Apple users I know have phones with faster
              | processors, higher-resolution displays, and higher prices
              | than their laptops.
        
               | lbotos wrote:
               | Agree on the misery. I was speccing out a build and
               | inadvertently picked a 2019 processor because it was
               | extremely unclear.
               | 
                | (I'm now actually looking at an AMD 7700 rig, because
                | Intel won't do ECC on "desktop" CPUs, except with a rare
                | chipset for which I can't find a mobo for sale at the
                | moment...)
        
               | kevin_thibedeau wrote:
               | It's the Packard-Bell marketing strategy. Confuse the
               | marketplace with a profusion of similar models so that
               | comparison shopping can't be easily applied by casual
               | buyers.
        
               | tracker1 wrote:
                | The 13 generations are particularly bad if you're trying
                | to advise someone looking at a used system, when half
                | the time the listing just says "Core i7", which is
                | meaningless without at least the model generation.
        
               | [deleted]
        
             | deathanatos wrote:
             | > _Do you remember the last Apple ad?_
             | 
             | Yes, and dear God am I sick of it. AFAICT, they've bought
             | all advertising space on the web, mobile, and TV for me, at
             | the moment. (It's the one of the iPhone auto-dialing 911 in
             | a wreck.)
             | 
             | I'm (still) not buying an iPhone.
        
             | wincy wrote:
             | The last Apple ad I remember was that song 1234 by Feist.
             | 
             | Which was... 2007.
        
             | bigmattystyles wrote:
                | Oh please, AMD CPUs had lower clocks, so to compete with
                | Intel's (making up numbers to illustrate the point)
                | 2.3GHz where theirs was 2.1GHz, they would call it an
                | Athlon 2300 or something to that effect. They may have
                | had a point that their 2.1GHz was as good as Intel's
                | 2.3GHz chip, but it's not been straightforward,
                | probably, since the 286. (Edit: I meant to reply to the
                | parent comment.)
        
               | systemvoltage wrote:
               | See Mac vs PC ads. Still memorable and impactful.
        
           | Merad wrote:
           | Are you maybe thinking of CPUs back when they were marketed
           | by clock speed? Because GPU naming has always been a mess. In
           | the mid 2000s for example you had the Nvidia Geforce 7 series
           | with product names such as: 7800 GS, 7800 GT, 7800 GTX, 7900
           | GS, 7900 GT, 7900 GTX, 7900 GTO, 7900 GX2. They've been
           | moderately consistent with "bigger numbers in the name =
           | higher end card" but beyond that you can't tell anything
           | meaningful without comparing the cards in a table.
        
             | userbinator wrote:
             | At least they didn't reuse the names... unlike e.g. the
             | _three_ variants of the  "GT730" they released.
        
         | reaperducer wrote:
         | _These big companies really need to get naming input from
         | someone other than marketing teams_
         | 
         | It's not the marketing teams to blame.
         | 
         | Marketing teams name things iPod, or MacBook, or PlayDate.
         | 
         | I don't know who names things at Intel, or Nvidia, or Sony, but
         | it's not the marketing team. At least not a good one.
        
           | ShakataGaNai wrote:
           | Clearly some departments of Sony have engineers naming
            | things. No marketing team would put out a product named the
            | "Sony WH-1000XM4", not to be confused with the "Sony
            | WF-1000XM4".
           | 
            | Overall, Nvidia generally has a very good naming system. The
            | names are easy to understand if you look at them for more
            | than a minute. Nvidia's 4090? 40 = generation. 90 = model.
            | A higher model # is better. They've stuck with the general
            | concept for the better part of 20 years.
           | 
            | Intel's naming is decent. Their cutesy names like Sandy
            | Bridge, meh - no one can ever remember those. But the Core
            | numbering system is solid. i3 is lowest, i9 is highest. The
            | processor numbers after that can be a little hard and do
            | require a bit of a decoder matrix to understand. But as long
            | as it's a system, with rules, that they follow, and can be
            | explained fairly easily - I'm ok with it. Heck, they have a
            | page that gives you the magic decoder ring:
            | https://www.intel.com/content/www/us/en/processors/processor...
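            | 
            | The Nvidia scheme really is mechanical. A sketch in Python
            | (the decode helper is hypothetical, just to show the rule):
            | 
            |   import re
            | 
            |   def decode(model):
            |       # first two digits = generation, last two = tier
            |       gen, tier = re.fullmatch(r"(\d\d)(\d0)",
            |                                model).groups()
            |       return {"generation": int(gen), "tier": int(tier)}
            | 
            |   print(decode("4090"))  # {'generation': 40, 'tier': 90}
            |   print(decode("3060"))  # {'generation': 30, 'tier': 60}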
        
             | arprocter wrote:
             | Acer's monitor naming looks like a cat walked on someone's
             | keyboard - KVbmiipruzx
        
               | muro wrote:
               | The best named monitor recently is probably the Waysus
               | 22.
        
             | Macha wrote:
              | I think you need to at least also consider the generation
              | along with the bucket for Intel CPUs. For most users a
              | 12th gen i3 is better than a ninth gen anything, yet
              | plenty of retailers kept old laptop SKUs around long
              | enough that you would see both side by side at a retailer.
        
             | izacus wrote:
              | I'll honestly be concerned for Sony if they stop naming
              | their products after random bash line noise.
             | 
             | It's now as much part of their brand as all other things
             | are.
        
             | eastbound wrote:
              | Sony's naming problem is not because of engineers; it's
              | clearly the marketing team, and the goal is most certainly
              | to make products incomparable - across continents, across
              | years, or between the ones given to
              | reviewers/journals/comparators and the ones that customers
              | can actually purchase.
             | 
             | Sony's problem is that they try to sell bad products for
             | the price of expensive ones, and the best way to do that is
             | to have incomprehensible names.
        
               | ShroudedNight wrote:
               | The MDR-7506 has just as obscure a name as anything
               | they're selling; it's not clear to me that the naming is
               | so much a strategy as a lack thereof...
        
               | Someone wrote:
               | I think the goal is more so that big chains can sell
               | model numbers that nobody else sells, making it risk-free
               | for them to promise "we'll match any cheaper price".
        
             | mey wrote:
              | Except Nvidia uses the same branding for its mobile and
              | desktop chips, and in the past has rebadged different
              | architectures under multiple gen numbers (the GT
              | 510/520/605/610/710 are all the GF119 chip).
             | 
             | It's all pretty bad.
        
             | gigaflop wrote:
             | For what it's worth, I bought a high end TV recently, the
             | Sony Bravia A90J. I've left out some of the full product
             | name, but this info is all you need if you care to look up
             | that TV.
             | 
             | When I was looking in physical stores, at physical devices,
             | I noticed that there were important differences between the
             | [A-Z][8-9]0[A-Z]s, when I would research the model numbers
             | online. 80 vs 90 indicated jumps in overall quality,
             | depending on the other letters in the model name, which
             | usually meant that the product was created specifically for
             | the store (like Best Buy vs Costco vs buying direct), and
             | would have other minor differences from the 'true' version.
             | 
             | A regular person would have probably just looked at the TVs
             | in-store and decided based on whatever looked best, but I
             | happened to have some specific features I wanted, and the
             | weird-ass model names helped.
        
               | solarkraft wrote:
                | TV naming is especially crazy. They have variants for
                | everything from geographical location to specific sales
                | events.
                | 
                | My TV lacks the ability to transmit audio via Bluetooth
                | (no, I can't enable it; I think it actually lacks the
                | module). Nobody could have told me that before I bought
                | it; the marketing material and manuals all claim that it
                | has it. There is precisely NO documentation for my
                | specific model.
               | 
               | I'm starting to think that they're actively counting on
               | people not _completely_ testing their devices after
               | getting them.
        
               | Melatonic wrote:
                | The A90J is the top model, right? I was looking at those
                | myself recently. Amazon Warehouse occasionally has a
                | cheap deal on one, but I am always scared those probably
                | have dead pixels.
                | 
                | I really wanted a Panasonic plasma, but it looks like
                | the sole importer may not be getting them anymore, or
                | might be getting fewer. But from what I understand, the
                | A90J and the top-end Panasonics are the best in that
                | they have a much better heatsink.
        
               | gigaflop wrote:
               | A90J is, by the research I did and the word of the person
               | who sold it to me (a family friend, has owned a TV
               | business for 25 years, and gave me his at-cost price),
               | the best. I absolutely love it. And yes, the panel +
                | heatsink are top notch. Some other models/brands use the
                | same panel, but lack the stronger heatsinks, and aren't
                | able to utilize it as well as possible.
               | 
               | It runs Android TV, which may or may not be a dealbreaker
               | for you, but I enjoy it enough. I just wanted to be free
               | of a vendor-specific TV os, in order to give myself more
               | flexibility when I try to set up a pi-hole in the future.
               | There's also a hardware switch to disable the TV's
               | microphone.
               | 
               | Also, the sound comes out from the panel itself, and is
               | (to me) great. It calibrates itself using the microphone
               | within the remote, by having you hold it a certain way
               | when performing setup.
               | 
               | Finally, there's an incredibly posh and satisfying
               | 'click' noise when you turn it off. I don't know why, but
               | this makes me like the TV more.
        
           | bee_rider wrote:
           | It is weird because Nvidia clearly has an instinct to give
           | their cards car names (with the GTX, GT, RTX, etc etc stuff).
           | They should just get rid of the numbers for the most part.
           | 
           | 4090 -> 2022 Nvidia Optium
           | 
           | 4080 -> 2022 Nvidia Melium
           | 
           | 4070 -> 2022 Nvidia Bonum
           | 
           | 4060 -> 2022 Nvidia Benem
           | 
            | (I brand-name-ified the Latin words for
            | best/better/good/okay.)
        
             | mosen wrote:
             | Without the numbers, you have no idea which one's better at
             | a glance (which is why they're retracting the "other"
             | 4080).
             | 
             | And "Bonum"... are you sure?
        
               | bee_rider wrote:
               | The problem with the numbers is that we expect them to
               | have some meaning. There's no inherent ordering between
               | maxima/altima/sentra but if you are shopping for Nissan
               | cars you figure it out. If you are spending a couple
               | thousand dollars on something you shouldn't pick at a
               | glance, you should look at the specs.
               | 
               | Bonum -- apparently that's the latin word for good? I
               | dunno I just dropped words into google translate and then
               | hacked off letters at random to fit the pattern. I'm sure
               | they can come up with better fake words.
        
           | rjmunro wrote:
           | Marketing teams name things 360, One, One S, One X, Series S
           | and Series X. I think that's the right order, I'm not sure.
        
             | TillE wrote:
             | When the Xbox One was announced, people complained that it
             | was confusing, but really it had been long enough since the
             | original Xbox that the name was just silly, not confusing.
             | 
             | The One/Series S/X crap is genuinely baffling, totally
             | incomprehensible unless you've really been keeping up with
             | every Xbox release. You can go on Wikipedia and figure it
             | out in a few minutes, but...you should not have to do that.
        
               | monkpit wrote:
               | Don't forget the OG offender in this category, the PSOne
               | / PSX
        
               | kmeisthax wrote:
               | In Sony's defense, everything else with the PlayStation
               | was actually pretty straightforward. PS1, PS2, PS3, PS4,
               | and PS5.
               | 
               | "PSOne" was a weird way to brand a slim console, but it's
               | still obvious that it's a PS1. And while Sony _did_
               | originally use PSX to refer to the PS1, that was an
               | internal codename, i.e.  "different from the Nintendo
               | PlayStation[0]". The gaming press ran with it because
               | people in that era insisted on awkward three-letter
               | acronyms for all games consoles. Reusing it for a weird
               | PS2 DVR combo unit is still way better than Microsoft
               | launching two _different consoles_ with the same name.
               | 
               | [0] The cancelled SNES variant with the also-cancelled
               | Super CD add-on built-in, both built by Sony.
        
             | Dylan16807 wrote:
             | The order is 360, One, Series.
             | 
             | The letter is just tier.
             | 
             | It's remarkable how thoroughly they managed to outdo the
             | confusing nature of "One". Who would look at "Xbox Series"
             | and think that's the name of a _specific generation_? It 's
             | an artistic masterpiece.
        
           | dingaling wrote:
           | What is an iPod without context? Some sort of protective case
           | with an RFID tag maybe?
           | 
           | PlayDate - is that a video conferencing product?
           | 
           | These are not good product names
        
             | ShakataGaNai wrote:
              | Context is required for basically all product names, unless
              | they've managed to make themselves generic. E.g.
              | https://www.businessinsider.com/google-taser-xerox-brand-
              | nam... Even then, if they are "generic" they still often
              | require the context of a specific country or language.
             | 
             | If I ask you about a Mustang, what do you think about
             | first? Are you into cars and it's a Ford Mustang? Are you
             | into Horses? Are you into Planes? Or maybe you're into
             | ships? Heck, there is an entire list of options:
             | https://en.wikipedia.org/wiki/Mustang_(disambiguation)
             | 
             | A good name is memorable, not necessarily descriptive. Most
             | product and company names today are made up anyways. Or
             | they are named after something else in a completely
             | arbitrary fashion.
             | 
             | The problem comes when a company establishes a name for one
              | thing, then uses it for another. The iPhone is a good name
              | in concept, Pro/Max/Ultra/Mini notwithstanding. But what
              | if tomorrow Apple said there was an iPhone Super Ultra Max
              | that was 10" and couldn't make calls? People would argue
              | that was an iPad and that this new Super Ultra Max was a
              | stupid name.
        
         | sulcate wrote:
         | But that's not the point. It's not meant to be intelligible.
         | The point is marketing, aka to misinform consumers. It's
         | working as expected and it happens in every field.
         | 
         | Choosing obscure names that make it extremely hard to compare
         | characteristics within products by a company, much less to
         | compare to outside competitors, is not a bug --- it's a
         | feature.
         | 
          | Try buying a bike and figuring out how to compare it to other
          | bikes by the same manufacturer from this year or last, or try
          | to figure out what features it carries. You're left doing what
          | you always do: staring at 7 tabs of spec sheets and slowly
          | trying to absorb the features of the various "poorly" named
          | offerings.
         | 
          | It's anti-consumer and I'm surprised there's not more outrage,
         | given that a market purportedly should consist of rational
         | consumers making informed decisions.
        
           | jamiek88 wrote:
           | >given that a market purportedly should consist of rational
           | consumers making informed decisions.
           | 
           | And that misconception of humans by economists has had
           | massive repercussions.
           | 
           | No human is rational.
           | 
           | We are emotion machines riding hormone waves. Fatigue,
           | hunger, anger, arousal all affect our choices and can be
           | gamed.
        
         | userbinator wrote:
         | _I get that Engineers tend to be more practical in their names,
         | and don 't have the finesse that marketing is looking for._
         | 
         | I thought they could've just used the codenames, or whatever is
         | actually written on the GPU IC itself...
         | 
         | https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_proces...
         | 
         | ...but then you realise they called the "12GB 4080" an
         | AD104-400, and the "16GB 4080" an AD103-300, while the 4090 got
         | named the AD102-300.
        
         | wongarsu wrote:
         | Yeah, even if they had a good reason not to call one the 4070,
         | the whole thing could still have been avoided by just calling
          | them the 4085 and 4080. And the marketing people could probably
          | have come up with something even cooler-sounding, if somebody
          | had just stopped them from going with 4080 12GB and 4080
          | 16GB.
        
           | FractalParadigm wrote:
            | The funny thing is Nvidia already has two suffixes for
            | better-than-the-xxx0 cards, without needing to create
            | another line of xxx5 products. The 16GB could have been
            | branded 4080 Ti or 4080 Super, with the 12GB being the
            | "base" 4080.
        
             | tracker1 wrote:
              | That was my thought... they should have just called it a
              | "Super", still leaving room for a Ti model later. Or bring
              | back the GS designation: 4080 and 4080 GS. They had lots
              | of options to add distinction.
        
         | CivBase wrote:
          | I'd be _shocked_ if the original names were engineering
          | decisions. It seems blatantly obvious that marketing just re-
          | badged the 4070 at the last minute and it backfired.
        
         | aprdm wrote:
         | To their credit, they did roll it back
        
       | SevenNation wrote:
       | I read this release 3 times and still don't know what's
       | happening. What does "pressing the unlaunch button" mean?
       | Discontinuation? Rebranding to RTX 4080? What?
        
         | karamanolev wrote:
         | There will not be a 4080 12GB card with the specs it was
         | announced with. Basically, "pressing the unlaunch button" is
         | exactly the opposite of doing the announcement. An attempt at
         | "we take back what we said, imagine that nothing happened".
        
       | ac29 wrote:
        | For those not in the know, the "4080 12GB", as compared to the
        | 4080 16GB, was not just the same card with a little less RAM,
        | as you might assume from the name. It also had ~20% fewer GPU
        | cores and was significantly slower for that reason.
        
         | newsclues wrote:
          | And a slower memory bus speed.
        
         | ErneX wrote:
         | Less memory bandwidth too IIRC.
        
           | PartiallyTyped wrote:
            | A 192-bit bus in 2023, on an 80 "class" GPU of all places.
        
             | MikusR wrote:
              | It's not like it matters, given their greatly increased
              | cache.
        
             | westmeal wrote:
             | yeah seriously what were they thinking
        
               | ChuckNorris89 wrote:
               | $$$
        
         | timbo1642 wrote:
        
         | cush wrote:
         | Good on Nvidia for owning up to and quickly fixing the
         | marketing mistake
        
         | mynameisvlad wrote:
         | Did they just think it was too good to be a 70 level card? It's
         | not like they had a 4070 and a 4070Ti and a Super 4070 already
         | and had to figure out another way to market it.
         | 
         | I just don't see the logic in the naming in any way shape or
         | form.
        
           | Spartan-S63 wrote:
           | Their price point was too high to be considered a 4070. If
           | they launched it as a 4070, it would be naked price gouging.
        
             | [deleted]
        
             | izacus wrote:
              | I think you're setting yourself up for another massive
              | disappointment if you think nVidia will sell those at any
              | kind of lower price in the future.
              | 
              | Because that's the issue, isn't it? Everyone wanting
              | nVidia to just sell those chips for less.
              | 
              | They'll slap 4070Ti or something on it and sell it to you
              | for the same price.
        
               | Macha wrote:
                | You might have missed it with how long the inflated
                | market lasted, but in the last few months, with the
                | ethereum PoS transition and the threat of recession
                | keeping gamers at bay, they literally haven't been able
                | to sell the 30-series cards even at quite a bit under
                | MSRP, in what was a very sudden reversal from being
                | above MSRP. Maybe a new series stole the demand, but
                | part of the reason they're holding out on lower-tier
                | SKUs is that the 30-series is not going away fast
                | enough, so there's a good chance we're back in the
                | 10-series vs 20-series days, where there's so much used
                | stock of the old series, and such a perception that
                | nvidia is gouging, that it hampers sales of the new
                | ones.
               | 
               | 30 series launch MSRPs looked very good until it became
               | clear you couldn't get them for that price, partly
               | because they were having to partially roll back the 20
               | series price increases, so it wouldn't be impossible to
               | see a repeat.
        
               | neogodless wrote:
               | You can get an RTX 3090 (MSRP $1499) for about $900-950
               | now. (In other words, the prices on these cards really
               | should come down a bit.)
               | 
               | So while the graphics card formerly known as RTX 4080
               | 12GB might perform about the same as the RTX 3090, it's
               | no better value.
               | 
               | The issue isn't the pricing, even though that's an issue.
               | 
               | The issue is what looks like an intentional attempt to
               | confuse consumers by selling two products with roughly
               | the same name, but one of them has 25% greater
               | performance. Some consumers may think "I'm coming from a
               | 6GB card, 12GB is plenty, I don't need to pay more for
               | the 16GB one, I'll get almost the same performance"
               | without knowing the technical details.
               | 
               | Ideally they'll still read/watch reviews and get the best
               | product for their budget, but that doesn't excuse
               | misleading names.
        
               | icambron wrote:
               | This was me. I was too busy to read/watch anything, 12gb
               | seemed right in the middle of the range, and I blindly
               | pulled the trigger. Later found out I wasn't quite buying
               | what I thought I was. It's shitty naming and I'm glad
               | they're fixing it.
        
             | imiric wrote:
             | > it would be naked price gouging
             | 
             | As opposed to the rest of the 40 series cards?
        
               | PaulHoule wrote:
               | If you divide the cost of the card by the pixel fill
               | rate, the 4090 is a bargain.
        
               | onli wrote:
                | It is a new series. It is supposed to be cheaper
                | relative to its performance than the predecessor
                | series, not priced as a higher tier.
        
               | plasticchris wrote:
               | This is a fundamental shift from years past where perf
               | per dollar scaled up every generation. Now they are
               | trying to scale performance while also scaling up price.
               | Only time will tell if the demand is inelastic or not.
        
               | neogodless wrote:
               | The RTX 3090 had an MSRP of $1499, so this is only a 6.7%
               | increase. If there was a fundamental shift, it was when
               | that card launched.
        
               | Dylan16807 wrote:
               | 3070 had .29 GP/s/$ at base clock.
               | 
               | 4090 has .28 GP/s/$ at base clock.
               | 
               | Considering this is coming out two years later, I'm not
               | so inclined to call it a bargain.
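                | 
                | For anyone who wants to check that arithmetic, a
                | minimal sketch (pixel fill rate = ROPs x clock; the ROP
                | counts, base clocks and MSRPs below are approximate
                | launch figures, and the 4090 comes out nearer .28 if
                | you use its ~2.52 GHz boost clock instead of base):
                | 
                |   # Rough perf-per-dollar: pixel fill rate = ROPs * clock.
                |   # Specs and MSRPs are approximate launch figures.
                |   cards = {
                |       "RTX 3070": {"rops": 96, "clock_ghz": 1.50, "msrp": 499},
                |       "RTX 4090": {"rops": 176, "clock_ghz": 2.23, "msrp": 1599},
                |   }
                |   for name, c in cards.items():
                |       gp_s = c["rops"] * c["clock_ghz"]  # GPixels/s
                |       print(f"{name}: {gp_s:.0f} GP/s, {gp_s / c['msrp']:.2f} GP/s/$")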
        
               | filmgirlcw wrote:
               | Meh, only if you accept that the 4090 is really a Titan
               | or a Quadro under another name. And to be fair, I think
               | that it probably is. But that doesn't match with the
               | consumer designation that we see in these things and so
               | bargain or not, this is a professional card being
               | marketed to consumers.
               | 
               | If you're looking for a pro card and have the cooling and
               | power to support it (not to mention, the workflow needs
               | that could benefit), that's great. If you're a gamer or
               | enthusiast, the price is still high enough (not to
               | mention the other changes you might need to make to your
               | rig to support the card) that the actual delta between
               | the potential and what you'll actually do with the card
               | means you should probably just stick with either a 3090
               | that is now half the price, or hold out for the other
               | 4000 series cards if you must get a next-gen card.
        
               | Retric wrote:
                | The 4090 can't actually play Cyberpunk in 4k at max
                | settings with RT without stuttering. People are seeing
                | 22-30FPS walking around.
               | 
               | There is a lot of confusion because some people assume
               | turning on DLSS increases settings but it actually lowers
               | quality. Sure DLSS is good enough most don't notice, but
               | you can say that about lowering most settings slightly to
               | improve performance.
        
               | Godel_unicode wrote:
               | This is incorrect. If you want to play e.g. Cyberpunk at
               | 4k maxed out with decent framerates the 4090 is the only
               | card that gets you there. Especially once you start
               | looking at meaningful numbers like 1% lows, even the 3090
               | is struggling to break the low 30s.
               | 
               | It doesn't apply to every game or every resolution, but
               | there are actual game scenarios where a 4090 makes sense.
                | Price and heat and power and space notwithstanding,
                | it's a meaningful upgrade.
        
               | filmgirlcw wrote:
               | Look, if you want to pay $1600 to play one game at 4K on
               | max settings, be my guest. I'm fine with it in 1440p or
               | in 4K at not max settings, but you do you.
        
               | flerchin wrote:
               | Dangit, I do want to play Cyberpunk maxed in 4k. Sorry
               | kids, I guess you'll have to take a gap year before you
               | can go to college, daddy's gonna upgrade.
        
               | cercatrova wrote:
               | College isn't that cheap unfortunately.
        
               | Wowfunhappy wrote:
               | I will add that, IMO, if it's possible for _any_ gpu to
               | play a just-released game at maximum settings at a high
                | resolution at 60+ fps, the developers haven't set the
               | maximum quality settings high enough. Leave some headroom
               | for future GPUs, or allow players who e.g. prefer a
               | ludicrously high draw distance at any cost to make that
               | choice and dial down the resolution to compensate.
               | 
                | Most games don't meet this bar, and I think gamers who
                | expect to be able to set every slider to the max and go
                | to town--yes, even on a $2K GPU--are mostly responsible
                | for that.
        
               | Godel_unicode wrote:
               | I disagree that developers should spend valuable
               | engineering time on producing games that can't be run.
               | Spend that time making other games instead, or squashing
               | bugs (looking at you, cyberpunk!) and maybe keep a
               | backlog of future features to patch in when the hardware
               | gets there.
        
               | autoexec wrote:
               | I agree! When it comes to graphics, just make them look
               | great with current (and not uncommon) hardware. When most
               | gamers have upgraded hardware capable of substantial
               | differences games can keep selling remasters if people
               | care.
               | 
               | I'm fine with nearly all gamers getting a better
               | experience even if that means the tiny fraction of gamers
               | who can and are willing to spend insane amounts of money
               | on the best of all possible video cards are not able to
               | take full advantage of their crazy hardware in most
               | games.
        
               | jandrese wrote:
               | There is a limited set of knobs a developer can add
               | without increasing their development costs. If you ship a
               | set of "ultra mega extreme" textures that will only be
               | usable with future hardware you are still bloating the
               | download by many gigabytes, probably dozens or even
               | hundreds. If your dev team says they can make even better
               | shadows but not on today's hardware then is it really
               | worth the development effort to create them now? You can
               | multiply particle effects to crazy amounts but that ends
               | up looking silly.
        
               | Wowfunhappy wrote:
               | To be clear, I'm just asking for extreme draw distances,
               | higher resolution shadows, and full quality textures. If
               | that would require significant engineering, I stand
               | corrected! I can certainly see how install size would be
               | a concern with textures.
               | 
               | I find it difficult to return to games such as Assassin's
               | Creed II because of their muddy textures and low draw
               | distances. These issues feel like something that could
               | have been avoided with just a tad more forward thinking!
               | 
               | There are also games like Quantum Break which (at least
               | at launch, not sure if it was ever fixed) included
               | mandatory upsampling which the user couldn't disable. The
               | reason given was that the game wasn't designed to be
               | playable at native resolution, and no current hardware
               | would be able to run it that way.
        
               | ynx wrote:
               | Yes.
               | 
               | Extreme draw distances: large open-world games or those
               | with very long, content-intensive environments need to
               | resort to tricks to unload parts of the world that are
               | not visible or quickly accessible. Extreme draw distance
                | can mean keeping orders of magnitude more objects
               | resident, which could mean a lot more materials loaded,
               | more draw calls, more VRAM usage, or more swap.
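                | 
                | As a toy model of why that blows up: with roughly
                | uniform object density, resident objects grow with the
                | square of the draw distance (a sketch; the density
                | figure is made up purely for illustration):
                | 
                |   import math
                | 
                |   # Toy model: resident objects ~ density * pi * r^2 on a
                |   # flat, uniformly populated map.
                |   density = 50  # objects per square km (made-up figure)
                |   for r_km in (1, 2, 4, 8):
                |       resident = density * math.pi * r_km ** 2
                |       print(f"{r_km} km draw distance -> ~{resident:,.0f} objects")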
               | 
               | Higher resolution shadows: Hard shadows tend to look bad,
                | soft shadows tend to perform badly, and worse with more
               | lights. It takes a lot of deep GPU knowledge to do these
               | in a visually convincing and high quality manner. The
               | difference between "good enough" and "perfect" will
               | easily cost you double digit fps at a minimum.
               | 
               | Full quality textures: As with the draw distance caveat,
               | implementing LODs is rather work-intensive. Some people
               | will tell you that you can automate it, and they're half-
               | right. If you are looking for top-notch game quality,
               | that absolutely does not cut it, but if you're not trying
               | to go the extra mile it can be serviceable.
               | 
               | Games are super inconsistent in how far they push
               | technology vs push the art, but there is rarely a "turn
               | the dial to 11" knob ready to turn. The production
               | requirements and technical limitations mix in
               | unpredictable ways.
               | 
               | Other times, games push ridiculously far in certain
               | directions that later become mainstream, and execute well
               | enough that, after they are copied into other mainstream
               | games, they feel deficient - not in spite of their
               | success, but as a direct result of it!!
        
               | int_19h wrote:
                | > Extreme draw distance can mean keeping orders of
                | magnitude more objects resident, which could mean a lot
               | more materials loaded, more draw calls, more VRAM usage,
               | or more swap.
               | 
               | The point is that all of this merely requires more
                | resources at runtime, not any additional work on the
                | part of the developer. So, by allowing limits higher
                | than is
               | practical on hardware at the time of release, the game
               | can scale somewhat better on future hardware. What's the
               | downside?
               | 
               | High-res textures are a different thing, since they
               | actually have to be painted. Or upscaled, I suppose, but
               | that's still code somebody has to write.
        
               | Wowfunhappy wrote:
               | > High-res textures are a different thing, since they
               | actually have to be painted.
               | 
               | Ah, I want to clarify again--I was imagining the
               | developers already had higher quality original textures,
               | which they had downscaled for release. The textures in
               | Assassin's Creed II, for instance, have the look of
               | images that were saved down to a lower resolution from
               | larger originals. But I could be wrong, or even if I'm
               | not, it might be less common nowadays.
               | 
               | As you say, the goal is to include things that only
               | require computational resources at runtime (even an order
               | of magnitude more).
        
           | LegitShady wrote:
           | they got greedy and every reviewer in existence called them
           | out on it. The price is too high for a 4070 so they called it
           | another 4080 no matter how different it was from the other
           | 4080.
        
             | samstave wrote:
             | What the heck was the price?
             | 
              | I paid $1,600 for an Evans & Sutherland card with 32
              | MEGABYTES to run Softimage on NT 4, because I couldn't
              | afford an SGI in 1997.
        
               | goosedragons wrote:
               | $900. But there was a time when the best GeForce you
               | could get was less than that by a pretty wide margin.
        
             | chippiewill wrote:
             | They should have just called it a 4075
        
               | Macha wrote:
               | It probably will be. A 4075, 4080-lite, 4070 super or
               | similar. I don't think their pride and target retail
               | price could stomach calling it the 4070 it clearly is.
        
           | filmgirlcw wrote:
           | They thought they could fool people and that the graphics
           | card demand of the last two years would let them coast on
           | this. They were wrong.
        
           | 0x457 wrote:
            | I think their thinking was: release a 4080 and a not-4080
            | under the 4080 name, which would allow them to claim that
            | the 4080 stayed at the same MSRP.
            | 
            | Except anyone with a brain saw through it, and enough
            | content was generated that the trick simply wouldn't work
            | anymore.
        
           | elabajaba wrote:
            | The 4080 12GB's performance sits about where a new 60
            | series card sits relative to the previous gen 80 series
            | card (60 series cards tend to be within +-10% of the
            | previous gen 80 series; the 4080 12GB is at most a 60 Ti by
            | that measure).
           | 
           | The 4080 16GB is around where a 70 series card tends to sit.
           | 
           | [0] https://www.overclock3d.net/news/gpu_displays/nvidia_rele
           | ase... Ignore the dlss2 vs dlss3 numbers, just look at the
           | base numbers since dlss3 numbers are interpolated framerates
        
           | mooman219 wrote:
            | Some reddit post looked at the % differences in core count
            | and clock speed relative to each generation. Based on its
            | specs relative to the actual 4080, it most closely fits the
            | spot a 4060Ti would occupy.
        
       | hankman86 wrote:
        | It's smart how they do a backflip on the 4080 12GB naming to
        | distract from the price hike. That's playing 3D marketing chess
        | ...not.
        
       | bryanlarsen wrote:
       | Based on the pricing, they unlaunched the wrong one. The price of
       | the 16GB 4080 is way higher than the 3080. Based on pricing
       | alone, it's the 12GB card that should be called the 4080 and the
       | 16GB card called a 4080Ti or Super or 4085 or something.
       | 
       | So hopefully this is a sign that they're going to adjust the
       | pricing of the 16GB 4080 to the price announced for the 12GB
       | former-4080.
       | 
       | I doubt it, but one can hope.
        
         | sleepymoose wrote:
         | In this case, I'm not sure that hope is really even worth it.
         | This whole launch has been blatantly artificially inflated to
         | move more of the backstock on the 30 series.
        
         | izacus wrote:
         | I strongly doubt it.
         | 
          | What I bet will happen is: the 4080 16G is going to be a very
          | fast card that sells like hot cakes at the set price point.
          | Just like the 4090 is selling very well despite all the
          | moaning about power and price.
        
       | bombcar wrote:
       | USB Consortium: We will make the most customer-confusing naming
       | system ever.
       | 
       | NVidia: Hold my beer.
        
       | josmala wrote:
        | The 4080 12GB did its job. It sent gamers a message: don't wait
        | for cheaper 40 series cards, go buy a 30 series now. Now the
        | best thing Nvidia can do is wait until navi 3 has launched and
        | release a 4070 Ti with the same specs, with the price point
        | decided by what AMD has set for its cards.
        
         | nomel wrote:
          | > It sent gamers a message: don't wait for cheaper 40 series
          | cards, go buy a 30 series now
         | 
         | Jensen mentioned this with the Q2 earning statements [1][2],
         | that pricing would be set to sell _old_ (30 series) inventory:
         | 
         | > And so our first strategy is to reduce sell-in in the next
         | couple of quarters to correct channel inventory. We've also
         | instituted programs to price-position our current products to
         | prepare for next-generation products.
         | 
         | > ... we've implemented programs with our partners to price-
         | position the products in the channel in preparation for our
         | next generation.
         | 
         | 1. See JayzTwoCents video about this:
         | https://www.youtube.com/watch?v=15FX4pez1dw
         | 
         | 2. Q2 Transcript: https://www.fool.com/earnings/call-
         | transcripts/2022/08/24/nv...
        
           | sliken wrote:
           | Indeed, apparently Nvidia misjudged the GPU forecast and the
           | crash of crypto mining on GPUs and ended up with a ton of
           | stock. Apparently AMD was more accurate, so here's hoping AMD
           | beats Nvidia to the punch at the $300, $400, and $600 price
           | points.
        
       | datacruncher01 wrote:
       | Watching that stock price circle like a turd in the bowl.
        
       | shmerl wrote:
       | I'm going to buy AMD RDNA 3 card anyway.
        
         | tracker1 wrote:
          | I'm leaning that way myself... we'll see where it lands... if
          | it can get RTX performance close to a 3090 or so I'll
          | probably go that direction... I'm not giving up 4 slots for a
          | video card; even if the boards just have m.2 there, it just
          | feels so wrong.
        
       | Rapzid wrote:
       | I was thinking to myself the other week "It's only a few dollars
       | more for the 16GB, why does the 12GB even exist?".
       | 
        | I was under the impression that perhaps they had changed
        | direction late in the game and had to offload those other
        | cards... or maybe they lacked the political will to "unlaunch"
        | it earlier.
        
       | nottorp wrote:
       | Hmm it looks like my idea of buying an AMD G series CPU with
       | integrated graphics so I can wait out the current video card
       | market was a stroke of genius.
        
       | bonney_io wrote:
        | Why not just call it the 4075, if it's really sort of between
        | the 4070 and 4080 in terms of price and performance?
        
       | mnd999 wrote:
       | I'm really hoping the new AMD cards are good because I think I'm
       | done with Nvidia.
        
       | dymk wrote:
       | I think it's just as likely that somebody hacked the NVIDIA blog
       | and made this post, as it is that the blog post is authentic.
        
       | jstummbillig wrote:
        
       | GiorgioG wrote:
       | NVIDIA wanted to sell the "12gb 4080" _cough_ 4070 for  "X080"
       | prices. People yelled bullshit. End of story.
        
       | cercatrova wrote:
       | A perfect time to re-release it as the 4070, but retain the 900
       | dollar price tag.
        
       | [deleted]
        
       | beebeepka wrote:
        | From the company that brought us the 970 4GB, which really had
        | 3.5GB.
        | 
        | Thing is, they know there are plenty of suckers out there, and
        | they will absolutely not name it (never mind price it) as the
        | 4060 it clearly is.
        | 
        | Help us, Lisa, you're our only hope. Not that AMD didn't
        | overprice the AM5 platform, too. The only way is to resist.
        | Just wait it out if you can help it.
        
       | gl-prod wrote:
       | * 4070
        
       | causi wrote:
       | Never forget that the inflation-adjusted launch price of the GTX
       | 1080 is $740 and don't you dare let somebody tell you $900 or
       | $1200 is "just inflation adjustment".
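        | 
        | A minimal sketch of that adjustment (the 1.24 CPI ratio for May
        | 2016 to late 2022 is approximate, and the result depends on
        | which CPI series you pick):
        | 
        |   # GTX 1080 launch MSRP (May 2016) in late-2022 dollars.
        |   msrp_2016 = 599   # launch MSRP; Founders Edition was $699
        |   cpi_ratio = 1.24  # approximate CPI ratio, 2016 -> 2022
        |   print(f"${msrp_2016 * cpi_ratio:.0f}")  # ~$743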
        
         | Rebelgecko wrote:
         | I suspect they're trying to maximize profits before the 25% GPU
         | tariff comes back. Then they can get good PR for keeping their
         | GPUs at the same overly inflated price
        
       | neogodless wrote:
       | https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-...
       | 
       | History... the GTX 1070 was a pinch better than the previous
       | generation GTX 980 Ti; cost ~$429 (MSRP $379). The GTX 1080 was
       | maybe 20-25% faster still, ~$699 (MSRP $599).
       | 
       | The RTX 2080 was launched as the top card (September 2018), with
       | the Ti and Super coming later.
       | 
        | It wasn't until last generation (September 2020) that Nvidia
        | introduced the x090 naming, and with an eye-watering $1499
        | price. The initial 10GB RTX 3080 had an MSRP of $699.
        
         | angulardragon03 wrote:
          | The 3090 heralded the return of the "extreme" SKUs that ran
          | from the 490 through the 690. There was no 790, but I would
          | consider the Titan/Titan Z SKUs to have picked up that
          | segment.
        
       | JudasGoat wrote:
        | Although there have been a lot of negative reviews regarding
        | the value of the 12GB 4080, I wonder if this could be more of a
        | response to what AMD is planning to release shortly. I don't
        | recall Nvidia backtracking in the past because of bad press.
        
       | intsunny wrote:
       | The amount of negative press around the 4080 12GB was not
       | insignificant:
       | 
       | https://www.youtube.com/watch?v=F7TRFK3lCOQ
       | 
       | 1.4 million views in three weeks. And that is just a single
       | Youtuber. Imagine all of them combined.
        
         | latchkey wrote:
         | All press is good press.
        
           | sliken wrote:
           | Heh, for some things.
           | 
            | Tons of reviews saying $900 for the 4080 12GB is insane.
            | The 4080 12GB is, after all, a 3060 update with a
            | 192-bit-wide memory interface. Even the 3060 Ti has a
            | 256-bit-wide memory interface.
           | 
           | Definitely had me decide to wait to see how AMD does in Nov.
        
             | latchkey wrote:
             | It gets more people talking about NVIDIA. That is worth
             | more than you waiting to see what AMD does.
        
               | sliken wrote:
                | Not going to agree on that one; the message was
                | generally "Nvidia is screwing gamers, buy ANYTHING
                | else".
               | 
                | Much like the general news of AMD server chips, and the
                | tons of press on Intel's shrinking Xeon market share.
                | Sure, people are talking about Intel, as a lesson in
                | what not to do.
        
           | 1123581321 wrote:
           | This is sort of true in this case. People who don't follow
           | GPUs closely, like me, now know that the 4090 is an appealing
           | card and the remaining 4080 deserves the name.
        
       | jamesfmilne wrote:
       | I just wish they wouldn't call the pro products RTX 6000 Ada
       | Lovelace.
       | 
       | Just call it the RTX L6000.
       | 
        | Maybe they think L is for Losers?
        
       | nsxwolf wrote:
       | I was expecting the article to announce it was being renamed to a
       | 4070 or something. Now I'm just confused. It's just going away?
       | Did they already manufacture these?
        
       | wnevets wrote:
       | Good, the whole thing was just stupid. It was a completely
       | different tier of product.
        
       | Kirby64 wrote:
       | It's not just that the 12 GB 4080 was a confusing name - it
       | wasn't the same class of card, at all.
       | 
       | In previous generations when they've had differing memory sizes
       | for the same card, that was the ONLY change. So, it was useful
       | for something like CUDA, but usually not for gaming. A specific
       | audience.
       | 
       | For the 4080, the 12GB version has the following changes:
       | 
       | * 12GB VRAM vs. 16GB VRAM (the obvious one from the name)
       | 
       | * 7,680 CUDA cores vs. 9,728 CUDA cores for the 16 GB.
       | 
       | * 192-bit memory bus vs. 256-bit memory bus (understandable,
       | since this scales with memory size... but also probably means the
       | memory itself is slower).
       | 
       | This isn't just a different amount of memory, it is fundamentally
       | a different product and should be marketed as such. Instead it's
       | Nvidia being greedy.
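        | 
        | The bus width alone is a big bandwidth gap. A minimal sketch
        | (the GDDR6X data rates are the announced figures as I
        | understand them, so treat them as approximate):
        | 
        |   # Theoretical bandwidth = bus width in bytes * data rate.
        |   # Data rates are approximate announced GDDR6X speeds.
        |   def bandwidth_gb_s(bus_bits: int, rate_gt_s: float) -> float:
        |       return bus_bits / 8 * rate_gt_s
        | 
        |   print(f"4080 16GB: {bandwidth_gb_s(256, 22.4):.0f} GB/s")  # ~717
        |   print(f"4080 12GB: {bandwidth_gb_s(192, 21.0):.0f} GB/s")  # ~504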
        
         | goosedragons wrote:
         | That's not quite true. They've pulled this stunt a few times.
         | Most recently with the 6GB and 3GB versions of the 1060. That
         | was arguably even worse because they did it after the launch.
        
           | MarioMan wrote:
           | I didn't realize it was this bad, but Wikipedia lists 7
           | different GTX 1060 chip revisions with 3, 5, and 6 GB
           | variants.
           | 
           | https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_proces.
           | ..
        
           | Kirby64 wrote:
           | The 1060 didn't really have significant differences in
           | performance, as far as I can tell. Still bad to do it, but
            | GPU benchmarks appear to put the performance hit at ~3% for
            | the 3GB version.
           | 
           | Looks like there's also a 5GB version which has slower
           | clocks... which is about 10% worse... but I assume that's
           | mainly due to clock rate, not the memory or the actual
           | silicon.
           | 
            | Those all have the same number of CUDA cores and the same
            | processing pipeline, though. Unlike this '4080 12GB'.
           | 
           | See: https://www.videocardbenchmark.net/compare/GeForce-
           | GTX-1060-...
        
         | salawat wrote:
          | Sounds like another GTX 970-esque abomination.
        
           | izacus wrote:
           | What?
        
             | ohgodplsno wrote:
              | The GTX970 is a fun bit of hardware. It's marketed as a
              | 4GB RAM card.
              | 
              | In practice, it's 3.5GB of normal GDDR5 plus 512MB of
              | horribly slow RAM that would have been considered bad in
              | 2000. So people who bought it thinking they were getting
              | a less powerful GTX980 got an even more inferior product
              | than they bargained for. The last 512MB is truly
              | worthless, only good for storing a desktop framebuffer.
              | Anything that needs to write into it quickly (like, say,
              | literally any game) will just slow down to a crawl.
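              | 
              | The same back-of-the-envelope math shows the cliff (a
              | sketch; the 7 Gbps GDDR5 and the 7+1 memory controller
              | split are as I recall them being reported, so treat them
              | as assumptions):
              | 
              |   # GTX 970 per-segment bandwidth: each controller is
              |   # 32-bit; data rate assumed to be 7 Gbps GDDR5.
              |   def seg_bw_gb_s(controllers: int, gbps: float = 7.0) -> float:
              |       return controllers * 32 / 8 * gbps
              | 
              |   print(f"3.5GB segment: {seg_bw_gb_s(7):.0f} GB/s")  # ~196
              |   print(f"0.5GB segment: {seg_bw_gb_s(1):.0f} GB/s")  # ~28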
        
               | ElectricalUnion wrote:
               | > The last 512Mb is truly worthless, only good for
               | storing a desktop framebuffer.
               | 
               | Not even good for that, even just reading the slow VRAM
               | slows down the entire GPU, so you never want to use the
               | last 512MiB.
        
           | wmf wrote:
           | The 970 was one of the greatest _gaming_ cards of all time.
        
             | Matthias247 wrote:
              | I seriously enjoyed mine and didn't understand what the
              | fuss was about. Sure, it might have had less memory than
              | people expected, but in the end the game performance for
              | the price point was what mattered to me. And that was
              | great. I got a card which delivered great performance at
              | that point in time (much as a 4080 12GB would for today's
              | generation of games) for less than 350EUR.
        
             | Melatonic wrote:
              | Not true at all - it seemed like a great deal at the time
              | but aged very poorly. You could run a 980 today and still
              | be doing great. Everyone I know with a 970 is struggling.
              | 
              | At the time you would have been much better off going for
              | a 780, as there were some bargains - especially on the
              | rarer 780 variants with more RAM.
        
               | IshKebab wrote:
               | I still have a 970 and it's doing fine. 60 FPS 4K Rocket
               | League. I dunno what games and settings you guys are
               | using but I'm sure it could handle something more
               | demanding if I just turn the settings down.
               | 
               | I will never upgrade!
        
               | tracker1 wrote:
               | Enjoy Cyberpunk at 12fps.
        
       | somehnacct3757 wrote:
       | Still can't get over the power draw for the 40 series. They
       | recommend an 850W PSU for the 4090? A 750W PSU for the 4080? Who
       | is the intended audience for these cards?
        
         | haunter wrote:
          | 4K 144Hz+ gaming, or VR. This is probably the best GPU ever
          | for high-resolution, high-FPS VR.
         | 
         | It's cutting edge technology for sure.
        
       | supercoffee wrote:
        | Not the first time that Nvidia has released completely
        | different GPUs under the same name. Around 2006, they had two
        | existing variants of the 8800 GTS (320MB and 640MB VRAM). A
        | year later, they launched a new 8800 GTS with a newer GPU and
        | 512MB. The newer card was much faster than both older versions.
        | I can only imagine the confusion for uninformed consumers who
        | might think 640 > 512 > 320.
        
         | 0x_rs wrote:
         | There's also the MX150 case more recently.
         | 
         | https://www.notebookcheck.net/Nvidia-has-been-sneaking-in-sl...
        
       ___________________________________________________________________
       (page generated 2022-10-14 23:00 UTC)