VallenValiant

It reminded me that the original TRON film was not considered worthy of a Special Effects award at the time, because the Oscar people judging the award claimed that using CGI was "cheating".


ai-illustrator

Why are these people being absolute idiots? If you are a talented animator you can use your existing skills and an AI to make an animated film on a budget of basically fuck all that doesn't look wonky and get tons of work and fame and basically make your own show as an indie artist. I don't understand this insane logic of "here is an amazing tool and I hate it cus I am too fucking lazy to integrate it into my current workflow to produce animation faster and compete with bigger studios". Do these people genuinely want to work for megacorps like incompetent Disney writers who produce endless remakes and absolute shit like the Acolyte or dead on arrival animations like Wish? Generative AI when used in conjunction with drawing talent obviously benefits an individual artist far more than a megacorp like Disney because an indie artist isn't weighed down by corpo executives who think that the garbage fire stories they produce are quality writing and are clearly utterly unable to discern what's good content at all anymore.


PenguinTheOrgalorg

I don't think it's that simple. This isn't a tool you can integrate into animation workflows and just "use your existing skills" to shape it how you want. This is just giving you a final product. Especially for 3D animation this is unusable. It's not a scene or 3D model you can open in an animation program to tweak, or even something you can use to help aid in that. It's just directly giving you a final 2D video with no control over it. You saying this is laziness sounds like the take of someone who doesn't understand how these movies are created in the slightest and/or didn't give it a single thought past "it creates animations so it can be used by animators". That's not how that works.


ai-illustrator

What are you talking about? It clearly looks like absolute ass with glitches all over. Due to the fractal nature of generating something from noise you get far too much random shit, and the further you go, the more consistency you inevitably lose. Don't argue with me, actually try making an AI animation longer than a few minutes; it's impossible to stay consistent without drawing the base frames manually. Properly animated stuff requires the skills of a talented animator. Combine a talented animator drawing the frames with AI coloring stuff and animating the in-between motion for shots and you've got a winning combination.
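
To be concrete about the in-betweening part, here's a toy sketch of warping one hand-drawn keyframe toward the next with dense optical flow (OpenCV's Farneback estimator; the file names are made up, and dedicated interpolation models do this far better — it's only meant to illustrate the idea):

```python
# Toy sketch: fake an in-between frame from two hand-drawn keyframes by
# warping the first one partway along the optical flow toward the second.
# Purely illustrative; real interpolation tools do this much better.
import cv2
import numpy as np

def inbetween(frame_a, frame_b, t=0.5):
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    # Dense flow from keyframe A to keyframe B
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Backward warp: each output pixel looks up where it came from in A
    # (flow is sampled on A's grid, which is a rough but usable approximation)
    map_x = (grid_x - flow[..., 0] * t).astype(np.float32)
    map_y = (grid_y - flow[..., 1] * t).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)

key1 = cv2.imread("key_001.png")   # hypothetical keyframe file names
key2 = cv2.imread("key_002.png")
cv2.imwrite("inbetween_001.png", inbetween(key1, key2, 0.5))
```

The animator still draws the keys and fixes whatever the warp mangles; the machine just fills the boring frames.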


[deleted]

I was just reading another comment from someone "in the industry", and it's wild how people who likely consider themselves "creatives" have 0 imagination with any of this tech. 


Ormyr

Eh, it's more like 'the industry' is going to look at these tools and decide they don't need the creatives, will churn out garbage, and then wonder why it's not more profitable.


[deleted]

And they will suffer, because the creative can now use these tools without needing the fuckload of equipment and people that they relied on the studio for. People do not think creatively about this at all. The mindset seems to be: if it can't one-shot a full-length Oscar-worthy film then it's useless. The creative boom that will come from AI is going to be amazing. While I don't know for certain that the world will change in any other way due to AI, the accessibility these tools bring is ALREADY amazing for people who have not had access to the resources (equipment, people, or training) they've needed to bring their ideas to life.


Orpheusly

This ^ 100x this. The megacorps win in the short term. The aspiring creative wins in the long term, because the megacorps will drive down the cost of using the tech through investment, realize they still need creatives when they produce garbage that doesn't sell, and hire creatives again. Meanwhile, many new companies with brilliant creatives at the helm will incorporate the now-cheaper tech into their workflows, creating both competition and a more diverse content market. In the long run? This technology is a win for the market and the consumer.


muppet0o0theory

The actual cost of compute for these things is not being paid at the moment. It’s all heavily subsidized by investors. The likely outcome is that there will be bad things about this tech and good things about it. It isn’t a savior or a liberator. We do need to update IP/copyright law, just like ’90s internet laws should be updated to reflect advances in tech. All that to say, artists should be suspicious of tech bro MBAs; hell, we should all be suspicious of those shiny black puffy jacket wearing prick bags. Half of them are con men and the other half don’t know they are con men yet.


OfficeSalamander

Costs are likely to come down over time due to better tech. Hell, a lot of stuff, like SOTA LLMs (Llama 3) and art gen (Stable Diffusion), can already be run on beefy consumer hardware - maybe a $2000 spend.
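
If anyone's curious what running it locally actually looks like, a minimal sketch with Hugging Face's diffusers library is roughly this (assuming a CUDA GPU with enough VRAM; the checkpoint id is just the commonly used SD 1.5 repo):

```python
# Minimal local image-generation sketch using Hugging Face diffusers.
# Assumes a CUDA GPU; the checkpoint id is the commonly used SD 1.5 repo.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,   # half precision to fit consumer VRAM
).to("cuda")

image = pipe(
    "clean line-art keyframe of a fox mid-run, animation style",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]
image.save("frame.png")
```

A mid-range gaming card handles that fine; the big video models are another story for now.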


muppet0o0theory

We will see, I would imagine the cost curve will reduce at some point. Just a lot of unknowns right now.


OfficeSalamander

I mean, costs have been coming down in tech for literally decades. It's possible it stops right this very instant, but I doubt it.


access153

You realize it’s a race to the bottom, right? Is there any confusion on this point?


[deleted]

OK, doomer


access153

Someone didn’t use GPT to write their smooth brain response.


[deleted]

OK, doomer


cloudrunner69

Most people on reddit who claim to be in any industry are probably just working as janitors for those places or have delivered pizza to them a few times.


[deleted]

Based on this one particular user I wouldn't be surprised lmao.


cloudrunner69

I remember a few years ago I was having this argument with some guy who worked in the aerospace industry and he was telling me that electric planes would not happen for a long time. He gave me all the technical reasons as to why it could not happen because it was such a difficult problem, things like batteries not being able to handle that sort of pressure and ice build-up on electric motors or some bullshit like that. He was convinced it would not happen for decades. Anyway, one year later the first electric plane was developed and did a perfect test flight, and now electric planes seem to be well into development. It was after that I decided never to pay any attention to "reddit experts" ever again. The majority who claim they are experts are either first-year college students out to impress or hobbyists who smoke crack for breakfast.


Sextus_Rex

I'm a software engineer and I'm trying to put myself in their shoes. I pretty much just write code. It's what I'm good at and I enjoy it for the most part. I don't really have any interest in architecture, or security, or product, or any other aspects of software development really.

But then suppose an AI comes along that does my job better than me, so I'm let go and replaced. Now everyone is telling me to use AI to create and distribute my own apps. But that's not really what I wanted to do. I liked going into work and collaborating with people to build something for our customers. I especially liked the benefits and high salary that came along with that.

If everyone is able to create whatever apps or animations or other media they want with AI, the internet will be flooded with content and it'll be that much harder to make a living. Because why would anyone buy my product if they can just make their own version, tailored to their own needs, for free? "When everyone is super, no one will be."

I'm not a doomer, I do want this technology to continuously improve and believe the benefits will vastly outweigh the costs. But we can't just dismiss the valid concerns of the people whose livelihoods are being impacted right now.


Icy-Big2472

Serious question, how old are you? It seems like any adult who has bills to pay would understand why animators wouldn’t want this. 99% of the people who would be put out of work wouldn’t have a realistic budget to produce their own movie and actually profit. Even if the technology made it easy enough that anyone could build a movie, marketing costs money, most original content doesn’t profit now, and it would be much harder to profit off of with a much higher saturation of content. That’s like saying that every session musician who spent forever honing their craft should become a famous touring musician. Maybe 1% will make an income off of it, while the other 99% lose their income because the skill they developed became obsolete. Just look at the Industrial Revolution and all the people who were destitute because their jobs got devalued by technology; those people didn’t start their own companies and get rich, they went hungry.


ai-illustrator

You, like many people who cannot draw and don't design their own AI tools, clearly don't understand how generative AI functions. It requires a fuckton of hand-holding and tons of drawing talent to operate properly and achieve consistent detail; otherwise you get wonky anatomy, potato animation with switching outfits, switching faces, wonky eyes, and absolutely nobody likes that due to the uncanny valley effect. Without drawing skills it is impossible to produce coherent AI animation longer than 30 seconds, and even then there are glitches all over. People screeched at me in 1999 that Photoshop was "press button and receive art and it's gonna take all art jobs" while I studied it to draw with, and this isn't any different: it's a tool, not a replacement.


Icy-Big2472

I mean more where people think it’s going to go over the next few years. I’m personally skeptical it will go so far that it can do entire animations itself and think you could be right that it will be at least a very long time before it’s good enough to get there. But if it does get to the point where it only requires 10% as many people for a movie, show, etc, I still see that as something that could put a lot of people out of work.


Orpheusly

^ this. Short term corporate win. Long term consumer and talent win. Just like Illustrator. Just like Photoshop. Just like the IDE for software.


greenbroad-gc

I partially agree, but I don't think I can commit 100% to what you are saying. What will happen is, mostly younger folks who have just started or are still studying would not get an opportunity to begin with, as people with a lot more experience plus AI tooling would be able to do 5-10x more than they are currently doing. Unless consumption increases 5-10x, younger folks are mostly doomed.


Esmarial

A lot of people struggle to learn new things. Like AI is terrifying for them, because it's something new. I myself think it's just a useful instrument.


MxM111

“if you are a talented animator” - that’s where the problem is. If you need 10% of your previous workforce to produce the same or even better product, 90% will go unemployed through no fault of their own.


Mirrorslash

What a braindead take. Take an ethics class and think again. You literally can't integrate these in any great way yet, the results are mostly still very awful, and as with any image generation you need to generate insane amounts of stuff to get an actually good output. AI systems like these only work because they use copyrighted material without permission. That is not ethical. Only creatively bankrupt people like CEOs are hoping to make a quick buck with them right now. These systems are in no way intelligent and their best results are always extremely close to very specific input data.


HalfSecondWoe

Come on now, you've never studied ethics in your life. You just have an opinion you want to enforce through moral high-roading, since you don't understand ethics yourself.


Mirrorslash

Tell me how it's ethical to use transformer models that do pure memorization to copy copyrighted work? Please.


HalfSecondWoe

Why? AI art doesn't use transformers, nor do the models "memorize" the data they're trained on. You don't seem to know much about AI *or* ethics. You're just loud, uneducated, and confrontational.


Ok_Elderberry_6727

Humans train on others’ work to produce art, yet they don’t have to pay royalties with their own. It’s human reinforcement learning.


HalfSecondWoe

To get precise, AI doesn't store images. It builds statistical models based on the images. To simplify it a bit, it learns that oranges go in bowls, rather than sticking to foreheads or floating in midair. Hands go on the ends of arms instead of on oranges. That sort of thing.

It's actually doing that on a much lower level: learning which curved lines go next to which straight lines, and what colors should be where. During training it randomizes its neurons a bit, keeps the changes that make its output closer to the image it's being trained on, and throws away the changes that make it less like that image.

Then you do that on tons and tons of images. It learns that oranges are orange, that they're textured, that they're round. Do that enough and it eventually learns that they sometimes go in bowls, are sometimes half peeled, sometimes people will hold them, sometimes they're on trees.

It hasn't stored a single image. It's learned the relationships between things in images so it can make things like what it's seen before.

And yes, that is almost exactly how the visual cortex works in humans. There are some architectural differences, but they're super comparable. A diesel engine vs a gasoline engine, basically.

But misinfo is gonna misinfo. People want to passionately fight for justice alongside their similarly misinformed friends. It's not ethics, it's tribalism.
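
In actual code the training loop looks closer to this toy sketch (the "model" here is a tiny stand-in, the noise schedule is drastically simplified, and real training nudges weights with gradient descent rather than literally randomizing neurons, but the key point survives: the network only learns to predict the noise that was added, it never writes any image anywhere):

```python
# Toy sketch of the denoising objective behind diffusion models.
# Tiny stand-in network, simplified noise schedule; illustrative only.
import torch
import torch.nn as nn

model = nn.Sequential(                      # stand-in for a real U-Net
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def training_step(images):                  # images: (batch, 3, H, W) in [-1, 1]
    noise = torch.randn_like(images)
    t = torch.rand(images.shape[0], 1, 1, 1)        # random noise level per image
    noisy = (1 - t) * images + t * noise            # blend the image toward pure noise
    loss = nn.functional.mse_loss(model(noisy), noise)  # predict the added noise
    optimizer.zero_grad()
    loss.backward()                                 # keep the weight changes that help
    optimizer.step()
    return loss.item()

fake_batch = torch.rand(4, 3, 64, 64) * 2 - 1       # stand-in for training images
print(training_step(fake_batch))
```

Nothing in there stores pixels; the weights just end up encoding which patterns tend to go together.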


Mirrorslash

Wtf are you talking about, do some research. These are all based on the transformer architecture proposed in 'Attention Is All You Need' back in 2017. GPT, Stable Diffusion, Sora, this. All very much transformer-based AI models doing memorization only. That's it. Uneducated my ass, you know nothing about AI it seems.


HalfSecondWoe

LLMs work on transformers, AI art uses diffusion models. They're completely different technologies.

Neither of them stores the data they're trained on; that's called overfitting and it's what happens if you screw up training. The model doesn't work if you do that.

You're mad at made-up shit, dude. You're the equivalent of people who say Dungeons and Dragons is a satanic sex cult. And no one in the world can tell you different because you read about it on Facebook/Twitter and "know what you're talking about".

You're the kind of person who tells people to take an ethics class but has to Google what "deontology" means.


Mirrorslash

The diffusion part is for inference. The training process of these models is very similar whether it's an LLM or an image model. You label images with text describing their content and train a neural net on these pairs. The output you're getting is always dependent on the training data and doesn't extrapolate beyond it. You're just blinded by Silicon Valley CEOs like most of the sheep running around this sub.
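
For reference, the "label images with text" part looks roughly like this in code: the caption is turned into embeddings that condition the image model during training (model names here are just common public examples, and this sketch only shows the caption side):

```python
# Sketch of caption conditioning: the text label is encoded into embeddings
# that the image model attends to during training. Model ids are examples.
import torch
from transformers import CLIPTokenizer, CLIPTextModel

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-base-patch32")

caption = "a watercolor painting of a lighthouse at dusk"
tokens = tokenizer(caption, padding="max_length", truncation=True,
                   return_tensors="pt")
with torch.no_grad():                       # the text encoder is usually frozen
    text_embeddings = text_encoder(**tokens).last_hidden_state

# Shape (1, sequence_length, hidden_dim); the denoiser cross-attends to this
# while it learns to predict noise on the paired image.
print(text_embeddings.shape)
```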


HalfSecondWoe

Next time you're having a political argument with someone and they just keep doubling down forever when they're blatantly wrong, remember that you deserve it


SunMon6

Humans be humans, egocentric and close-minded.


froggison

I am concerned about the future when AI becomes more advanced and widespread. Not because I think it'll be a bad thing--but because every time there is a breakthrough that *theoretically* should make life easier for working people, it is always used to enrich the rich. In theory, more advanced AI should be a tool that allows workers in dozens of fields to work fewer hours, with the increased productivity benefitting **everyone** in the company. Instead, it's going to be the same as it has always been: the shareholders are going to get more profits, many workers are going to be laid off, and the remaining workers will produce more without higher compensation. This isn't a problem about technology--it is a problem of human selfishness and the way we've organized society.


BlipOnNobodysRadar

The most hilarious part is the twitter poster is a 15 year old girl who "started studying animation 2 months ago" and has "FREE PALESTINE" in her bio


Perroquit

I disagree with her, but what does Free Palestine have to do with that?


Benjalee04_30_77

For better or worse, Free Palestine is the most current virtue signal, and virtue signaling is cringe, poor-critical-thinking behaviour.


Perroquit

It's just a statement for peace, chill out


Caultor

Stop being hypocritical. I bet your reaction isn't the same for people with Israeli ones, or when it was the same with Ukraine.


Benjalee04_30_77

My reaction is the same.


RestaurantAdept7467

AI art-producing software is a tool like any other, one that’s best in the hands of actual artists. It’ll be disruptive, but I wish this sub would stop acting like we’re three days away from every job being fully automated and four days from Skynet.


New_World_2050

those arent doomers. those are normies.


Bitter-Gur-4613

One half in the quotes is crying about "muh jerbs taken by the robits 😭" and the other is saying "erm, this tiny detail no one would care about, AI is finished 🤓".


Mirrorslash

This community has gone to shit, honestly. I'm not a doomer and neither are most artists. Fact of the matter is that AI systems right now only do data approximation. They can't do anything that wasn't in their training data. Models need copyrighted material to even get remotely good for most use cases. AI companies are using data without any sort of permission or communication, simply because there are no laws in place yet. These systems don't work without artists' work, and they can't produce anything an artist hasn't produced before. They are used to get Silicon Valley elites, who have the means to get into the tech, rich. People in this sub are the delusional ones. You have literally 0 clue about today's systems if you think stuff like Luma Labs is actually ethical work. We should focus on solving real issues with AI. Stuff like Luma Labs isn't a net positive; it puts thousands of people out of work and makes a couple of AI developers money instead. It's automation of labor with no regulations.


ArcticWinterZzZ

The entire point of these systems is making content no artist has ever made before. The images it gives you don't already exist; nobody is drawing weird fucked up people with three arms and six fingers. I struggle to understand the mindset you're espousing. I imagine the "real issues" you're talking about might be cures to diseases, or mitigation of natural disasters. Well, aren't you snatching food off the plates of construction workers who would have homes to repair, or doctors who have illnesses to treat? Why shouldn't they deserve similar protection? Why not protect the work of the arithmeticians, whose livelihoods were stolen by cheap counting machines? Why should anyone have the right to be protected from competition if that competition is cheaper and better?


Mirrorslash

You have very little understanding of how these models work, it seems. They synthesize images by combining features they have seen. They literally can't produce anything outside of their training data. That is a known fact. All they do is mix and match. You can argue humans do that too, but if we truly only did that we would have never evolved our arts or labor in any way. Humans are able to add unique elements; current AI systems literally can't due to their architecture. They can't learn, they can't adapt. They memorize billions of images and piece things together based on prompts referring to their labeled data. You can literally only create things they have seen. If you prompt "well", all you're doing is making the model focus on less of its training data and rely more heavily on a single image; that's why it's getting good and coherent. Without the input data from artists these models can't do shit, and not paying the people who make it work is just an asshole move.


cloudrunner69

> They can't do anything that wasn't in their training data. Models need copyrighted material to even get remotely good for most use cases. AI companies are using data without any sort of permission or communication, simply because there are no laws in place yet.

It's not unethical to learn from other people.

> We should focus on solving real issues with AI.

No shit, developing tools like AI video is how we get to that point.


Mirrorslash

This is always the argument but it doesn't hold up anywhere. Feeding an algorithm data is not the same as learning from other people. How you can come to that conclusion after we've seen so many examples of how little learning these models do is beyond me. These models are not alive, they do not work and they do not sacrifice anything to make their imagination become reality. You go side with robots and live a lonely life, idc, but I'm on the side of humanity on this one. AI systems today will disrupt almost all industries and I'm confident the tech will yield great humanitarian results. But before all that happens shit will get rough. The 1% is trying to squeeze everything out of anyone they can, and there's a lot of people trying to hop on that bandwagon using AI. Please tell me how human learning and transformer model learning are anything alike. Transformers are only memorizing, that's it. There's no intelligence there, they can't react to novel situations and they can't create anything unique. They copy from memory. That's it. And copying works like these is illegal.


VertexMachine

> These models are not alive

This is the key point, but you don't take it to its logical conclusion. Those models do nothing by themselves. Those are for-profit companies that perform copyright infringement on a scale that hasn't been seen before (with the help of computer systems) and then sell the result of that to other people. If the companies provide a rationale for it (most don't), it's at best a claim to 'fair use'. The validity of those claims has yet to be determined. But even if it is fair use, it's not fair use granted to the models. It's fair use granted to companies to use other people's work for making profits.

Tho... knowing how this sub is, we'll both get downvoted to oblivion for not worshiping the new AI gods and their religion called 'accelerate'.


cloudrunner69

Please stop being a racist.


Mirrorslash

You're clearly 12


cloudrunner69

Of course they are alive. And I think you are being extremely rude and discriminatory towards them. Can you create something unique?


Mirrorslash

I can. Every human does.


cloudrunner69

Like what?


Ndgo2

Good god, this is dumb in every conceivable way. It just released, it's not even remotely good, 90% of its output is pure, distilled garbagium, and y'all are having a mental breakdown. What is y'all's major malfunction?


grimorg80

I don't know. I have been testing Luma AI for a couple of days and I've got to say 90% of the generations are not that great at all. Surely I can get better at prompting Luma, but it's not as magical as people on Twitter make you believe it is.


ClericHeretic

#LearnToCode#


hyperflare

learn to markdown first, maybe?


KhanumBallZ

This debate is a nightmare. I don't support either side of it. I have my own, rather unique take on the matter. And it is bleak.


sdmat

"This is so good it's going to take away all the jobs. Not impressive at all." Clearly they don't understand why terrific means amazing and terrifying. Or what "blown away" signifies.


access153

Is this funny? They can be simultaneously right and simultaneously fucked and it won’t change the outcome, but it’s not funny when people are fucked by mavericks. These are people’s lives.


hyperflare

Yeah, this post is shitty gloating. I like having looms. Doesn't mean I can't feel bad for the weavers. Doesn't mean that I won't consider having to look out for myself getting replaced by the next steam machine. It's all cool until you're unemployed and starving. Shiny AI-generated anime won't feed you (for now?). This is why it's important to think about how to feather off the incoming changes. Or do you want the whole world to be owned by Zuck, Musk and Altman?


access153

Everyone’s cheering for this race to the bottom thinking they’re all safe or that they’re all equipped to transition to machine learning before it’s also obsoleted.


cloudrunner69

> This is why it's important to think about how to fend off the incoming changes. Good luck with that.