OpenAI has a big problem: they keep trying to live in the future. The issue is that eventually these language models are going to be sufficient for most people, and they're going to be capable of being run on personal computers. A few years ago, 16 GB of VRAM couldn't really do s**. Today I can actually use it for a decent amount of stuff. I still use my APIs, but fast forward a few more years, and if memory doesn't become ridiculously expensive, everyone's going to be able to have their own language model without paying a monthly fee. By the time OpenAI gets where they want to be, 99% of people will be able to run it personally.
Of course we're still going to need enterprise-level language models, and we'll still be using those APIs, just significantly less often. Imagine having Opus 4.5-level coding ability on your personal computer at all times. Within 2 years that'll run on the average GPU.
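To make the 16 GB point concrete, here's a rough sketch of what running a quantized open-weight model locally looks like with llama-cpp-python. The model path is a placeholder, and whether it actually fits in 16 GB of VRAM depends entirely on the model size and quantization you pick.

```python
# Minimal local-inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The GGUF path below is a placeholder; swap in any quantized open-weight model
# small enough for your card.
from llama_cpp import Llama

llm = Llama(
    model_path="models/some-open-model.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,   # offload all layers to the GPU
    n_ctx=8192,        # context window; larger contexts cost more VRAM
)

out = llm(
    "Write a Python function that reverses a linked list.",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```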
They are going to face a really harsh reality in the future. Nothing against GPT, it is a great language model. I am having some issues with Codex and 5.2; it likes to think too much. But it's not bad, especially when you drop thinking after creating the task list.
I don't think they're worth that much, though, because of all that. They aren't really holding much other than assets. Are they really holding on to $800 billion in assets? I guess you could argue that VRAM prices have gone up, so maybe lol