[sdiy] Chat GPT Image analysis.
René Schmitz
synth at schmitzbits.de
Wed Oct 18 22:16:46 CEST 2023
Am 18.10.2023 um 20:23 schrieb Quincas Moreira via Synth-diy:
> Interesting, but my experience is not that it continues my sentences,
> rather that it replies to my queries with very useful information and
> ideas. I know nothing is original and it’s derived from farming
> existing information, but the result is far more interesting,
> engaging, useful than simple predictive text. And it has already
> learned language it was not trained on, etc. I’m not scared of it, but
> I’m intrigued and interested; it’s a neural network and seems to be
> evolving beyond what even its programmers expected.
If I understand this right, to turn this "continuation engine" (aka GPT)
into a chat bot (aka ChatGPT), it's additionally trained on pairs of
questions and answers, so it continues your question with the most
likely answer.
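A rough sketch of that framing trick in Python (the "Question:/Answer:" template here is invented for illustration; real chat models use special tokens and further tuning on top, but the continuation framing is the same idea):

```python
# Illustrative only: chat training data framed as plain continuation text.
# The template below is made up, not any vendor's actual format.

def format_pair(question: str, answer: str) -> str:
    # Train the model to continue "Question: ...\nAnswer:" with the answer;
    # at chat time the user's question slots into the same template and the
    # model "continues" with the most likely answer.
    return f"Question: {question}\nAnswer: {answer}"

example = format_pair("What does a VCO do?",
                      "It generates a waveform whose pitch tracks a voltage.")
print(example)
```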
This training data is (was) largely prepared by humans, so far. I think
some groups have harnessed training data prepared by an AI to feed into
their own projects, greatly reducing the cost of that step.
Best,
René
>
> On Wed 18 Oct 2023 at 11:43 cheater cheater via Synth-diy
> <synth-diy at synth-diy.org> wrote:
>
> > I have to say I am very excited to see where this GPT thing goes
> but also a little frightened by it.
>
> most people who say they are either excited or frightened by GPT say
> that because they are mystified by the software, in turn because they
> don't know what it does. So let me give you a short description.
>
> My background: I worked as a software engineer in some of the most
> famous AI startups on the recent market, which created less public
> competitors to GPT and ChatGPT.
>
> The short of it is: remember on your smart phone, when you're typing
> out a message, and it shows you the next word you might type above the
> keyboard? And you can tap it? Sometimes you can keep tapping and a
> sentence will come out? That's the core idea behind GPT.
>
> Basically, what GPT ("Generative Pre-trained Transformer") does is
> this: given the start of a sentence, it finishes that sentence in the
> most expected way.
>
> So let's say you start with:
>
> Trees are
>
> GPT has read every text on the planet. It has a frequency table of
> every word that comes after "Trees are". Example continuations are:
>
> Trees are green ... (rank 72)
> Trees are large ... (rank 1)
> Trees are wooden ... (rank 15)
>
> It finds out the most popular word after "Trees are" and tacks it on.
>
> Then it repeats the process with the next word. For example, since the
> most popular word was "large", the new prompt for it is:
>
> Trees are large
>
> continuations for this might be:
>
> Trees are large plants ... (rank 7)
> Trees are large, green ... (rank 52999)
> Trees are large and ... (rank 122)
>
> and so on.
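The loop described above can be sketched in a few lines of Python (the contexts and counts here are made up for illustration; a real model learns them from text and conditions on far more context than two words):

```python
# Toy greedy next-word generator: assume we already counted how often each
# word follows a given two-word context in some corpus. Counts are invented.
freq = {
    ("Trees", "are"):    {"large": 900, "green": 120, "wooden": 40},
    ("are", "large"):    {"plants": 300, "and": 80},
    ("large", "plants"): {},  # no known continuation: stop
}

def continue_text(words, steps=5):
    words = list(words)
    for _ in range(steps):
        table = freq.get(tuple(words[-2:]), {})
        if not table:
            break
        # tack on the single most popular next word, then repeat
        words.append(max(table, key=table.get))
    return " ".join(words)

print(continue_text(["Trees", "are"]))  # "Trees are large plants"
```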
>
> Now OpenAI's GPT actually takes more context than two words. It'll
> look at the whole paragraph you put in, and figure out the next most
> probable word to tack on to the end. But it only ever does that: it
> goes one, word, by, one, word.
>
> GPT isn't smart. It doesn't know what trees are. When you ask it what
> trees are it doesn't think to itself "hmm, what is my definition of a
> tree, an object I know of?". For GPT, trees don't exist. It has no
> object permanence - like a toddler. If we started a campaign, where on
> every forum, mailing list, news website, and encyclopedia we say that
> trees are made out of metal, GPT 5 will soon enough start telling
> people that:
>
> Trees are made out of _____ (inserted most popular word: "metal").
>
> It's like the kid taken to the blackboard that doesn't know how to
> answer the teacher's question: "Johnny, what is the capital of
> Colombia?" "It's... uh... er... uh..." (2 minutes pass) "OK, Johnny,
> B...." "Berlin?" "Bo...." "Bo...dapest?" "Bog..." "Bog roll!"
>
> There's no reason to be scared of a precocious phone keyboard.
>
> And it isn't going anywhere, because interesting output requires
> operating on concepts - not just doing guess-the-next-word.
>
> GPT has one great application: it's great if you want to be lied
> to. It's great on assignments like "tell me a sci fi story" or "tell
> me about faeries". But otherwise it has the IQ of an absolute idiot.
>
> If you want to understand how GPT "thinks", play an online game called
> Semantle. (google it, I don't want to put in links and end up in spam
> folders). Once you've won a few games, you know a little bit about how
> predictive text sees the world of words.
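Semantle runs on word embeddings: each word is a vector, and "closeness" is cosine similarity between vectors. A toy sketch with made-up 3-d vectors (real embeddings have hundreds of dimensions and are learned from co-occurrence statistics, not written by hand):

```python
import math

# Invented 3-d vectors, purely for illustration.
vec = {
    "tree":   [0.9, 0.1, 0.0],
    "forest": [0.8, 0.2, 0.1],
    "metal":  [0.1, 0.9, 0.3],
}

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# In this toy space, "tree" sits closer to "forest" than to "metal".
print(cosine(vec["tree"], vec["forest"]) > cosine(vec["tree"], vec["metal"]))
```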
>
> On Thu, Oct 12, 2023 at 10:27 PM Kevin Walsh via Synth-diy
> <synth-diy at synth-diy.org> wrote:
> >
> > A quiet week so...
> >
> > OpenAI.com GPT4.0 has just released image analysis.
> >
> > I tried it for a VCO circuit (Schmitt/Inverter) and it gave a
> decent explanation of the circuit.
> >
> > I got it to write Arduino code for a MIDI controlled baby8
> sequencer with nothing but prompts.
> >
> > I have to say I am very excited to see where this GPT thing goes
> but also a little frightened by it.
> >
> > Thoughts?
> > _______________________________________________
> > Synth-diy mailing list
> > Synth-diy at synth-diy.org
> > http://synth-diy.org/mailman/listinfo/synth-diy
> > Selling or trading? Use marketplace at synth-diy.org
>