Maybe you can elaborate on that, just from a technical standpoint. What makes a Turing-complete language essential for creating a DAC?
Well... In plain English, I would explain it this way:
Imagine two people, Alice and Bob. Alice knows only how to add and subtract numbers, but Bob knows how to add, subtract and multiply them.
If you ask Alice to calculate 2*3, she will do it like this: 2 + 2 + 2, or 3 + 3 if she is smart enough. Obviously, Bob would just multiply the numbers.
Now let's ask them to calculate 2.5 * 3.72. Alice won't be able to do it, but Bob will handle it without any problems.
You see, there is a difference between Alice and Bob, because Alice, unlike Bob, is Turing-incomplete.
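To put the same idea in code, here is a rough Python sketch, purely for illustration (the function name is made up): "Alice" can only add, so she has to build a product out of repeated additions.

```python
# Purely illustrative: "Alice" cannot multiply, so she computes a product
# by adding one operand to itself over and over.

def add_repeatedly(a: int, b: int) -> int:
    """Compute a * b using nothing but addition: a + a + ... (b times)."""
    total = 0
    for _ in range(b):
        total += a
    return total

print(add_repeatedly(2, 3))  # 6, i.e. 2 + 2 + 2
```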

Alice can normalise the decimals, i.e. scale 2.5 and 3.72 up to 250 and 372, multiply by repeated addition, then adjust the decimal point back afterwards. Since all numbers in computers are ultimately binary (often written in hex) and have no decimal point of their own, that's essentially how it works anyway.
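A rough sketch of that idea in Python (names and string-based input are just for illustration): scale both decimals up to whole numbers, multiply them with repeated addition only, then move the decimal point back at the end.

```python
def add_repeatedly(a: int, b: int) -> int:
    """Multiply two non-negative integers using nothing but addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

def multiply_decimals(a: str, b: str) -> float:
    """Multiply two decimals given as strings, e.g. '2.5' and '3.72'."""
    # How many digits each operand has after the decimal point.
    places_a = len(a.split(".")[1]) if "." in a else 0
    places_b = len(b.split(".")[1]) if "." in b else 0
    places = max(places_a, places_b)
    # Normalise both to the same scale: 2.5 -> 250, 3.72 -> 372.
    int_a = int(a.replace(".", "")) * 10 ** (places - places_a)
    int_b = int(b.replace(".", "")) * 10 ** (places - places_b)
    # The actual multiplication is done with addition only.
    product = add_repeatedly(int_a, int_b)
    # Move the decimal point back (written as a division here for brevity).
    return product / 10 ** (places + places)

print(multiply_decimals("2.5", "3.72"))  # 9.3
```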
I really don't understand what you are going on about with your claim about DACs and Turing-complete languages. You can create a language targeted at performing a specialised function without it being Turing-complete. The whole point of that term is that the language can compute any computable function. Either you're deliberately confusing the subject or you've made your case very poorly. Possibly both.