<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<div class="moz-cite-prefix">
<div class="moz-cite-prefix">On 24.04.26 at 20:45, Eric
Brombaugh via Synth-diy wrote:<br>
</div>
<blockquote type="cite"
cite="mid:f59f5cee-cc86-4c40-835e-2eb675b474ef@gmail.com">A more
interesting topic to me is the use of machine learning and AI
methods for realtime synthesis / effects and music creation. <br>
</blockquote>
Hmm, there we are again, where Roman wrote "I really like to know
exactly how the thing I'm making is doing what it's doing in every
detail." In synthesis, too, knowing what I am doing is the most
satisfying part for me. I don't want to use whatever comes out of a
black box, even if it sounds "cool". I want to have made it on my
own.<br>
<br>
<br>
<br>
On 25.04.26 at 12:18, Sean Ellis via Synth-diy wrote:<br>
</div>
<blockquote type="cite"
cite="mid:KL1PR02MB6663D5E0BB6C3BE9A5B1BCBEB3282@KL1PR02MB6663.apcprd02.prod.outlook.com">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<style type="text/css" style="display:none;">P {margin-top:0;margin-bottom:0;}</style>
<div class="elementToProof"
style="font-family: &quot;Calibri&quot;, &quot;Helvetica&quot;, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">There
is nothing in the current LLMs that helps me.</div>
</blockquote>
<br>
Here are a few more observations on what LLMs can and cannot do. I
work (or rather, worked) in technical support for
enterprise ticketing software that is used, among other things, for
... technical support. We introduced an API for LLMs quite early on
(four years ago) and tried to use it for our own tasks as well. Our
goal was to generate generic HowTo documents from individual
support tickets that had been successfully resolved. Unfortunately,
we found that while LLMs can do many things, there is one thing they
cannot do: take a specific case and derive the general problem from
it. An LLM can derive a solution for an individual problem from
generic knowledge it has previously learned. But an LLM will never
recognize the generic concept behind a specific case. That requires
intelligence, and LLMs are not intelligent. <br>
<br>
</body>
</html>