
Showing 20 of 50 results by klosure
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 17/01/2018, 21:43:48 UTC
Cheddur allows you to link wallets, exchanges, and other services that support AGRS so that new users can easily get started

Security reminder: never give your private keys or wallet files to an unknown or untrusted wallet app, especially not one that advertises by spamming every altcoin forum

As far as AGRS is concerned, there is nothing that this new app can do that the official Omni Wallet can't do equally well. Play it safe.
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 23/08/2017, 19:45:32 UTC
@Ohad: no comment about HMC's reply?
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 14/08/2017, 19:59:56 UTC
@Ohad, I have a question. I've skimmed through the IRC logs of the last two months and found a very promising conversation between you, HMC and Stoopkid on 6 June about the maths of the new Tau. As far as I have been able to understand, HMC and Stoopkid were trying to understand how MSOL is able to verify higher order programs in spite of not being itself a full second order logic. And you were answering something about the higher order program being fully beta-reduced and encoded as a tree, which apparently was supposed to turn it into a first order expression. Then the tone quickly deteriorated, and you ended up rage quitting the conversation without finishing your explanation.

As a result, three questions / remarks which I found very interesting were left unanswered / undeveloped, to my disappointment.

Quote
23:37 < HMC_a> so yes or no, is there a second(+) order expression which cannot be consistently proven in any mso framework?  In general is full second/third order collapsible to msol, or isn't it?

I expect this one must be discussed somewhere in the works of Bruno Courcelle, so you could point us to some theorem or conjecture if that can save you some time with the explanations. Don't worry about making it understandable for us mere mortals. I'm happy with an explanation that I can't fully comprehend, so long as HMC and you seem to be on the same page.

Quote
22:17 < HMC_a> but maybe I'm missing some collapsability magics, somewhere, so perhaps you could give a concrete example of verification of such an example?

I concur wholeheartedly with this one. A simple example of a second or third order program in STLC+Y with limited inputs, executed against all input combinations and encoded as a tree, followed by a demonstration of how MSOL verifies it, would be hugely helpful in getting some intuitive understanding of how it works. That would give us the tools to try to find a case where it doesn't work (there shouldn't be any). You could recycle that explanation for the whitepaper, so no time wasted here.
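For what it's worth, here is the kind of toy I have in mind (purely illustrative, in Python rather than STLC+Y, and certainly not your actual construction): a second-order program over a finite domain, fully applied over every input combination so that nothing higher order remains, only a flat first-order table.

```python
# Purely illustrative: a second-order "program" over booleans, fully
# evaluated over its finite input space so no higher-order structure remains.
ID = lambda x: x
NOT = lambda x: not x

def twice(f, x):
    # Second order: one of the inputs is itself a function.
    return f(f(x))

# Enumerate every (function, argument) combination. The result is a flat,
# first-order table (leaves are plain truth values, no lambdas left).
TABLE = {(name, x): twice(f, x)
         for name, f in (("ID", ID), ("NOT", NOT))
         for x in (False, True)}
```

The interesting part would then be seeing how MSOL verifies a property over such a fully-applied tree, which is exactly what HMC is asking for.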

Quote
22:18 < naturalog> in bohm tree you dont have lambda terms
22:18 < naturalog> they're all reduced already
22:18 < naturalog> so the higher order rules disappear and remain only with true/false or other info

What kind of reduction are you talking about? How can the result not have lambda terms? Do you mean no lambda abstractions / bound variables? How do you guarantee that?

I understand that you probably don't want to bother making these explanations if you feel the tone of the conversation on IRC was not respectful, but please remember that IRC conversations are more or less supposed to serve as documentation and research log in the absence of actual documentation, and that they benefit the entire community, including investors and potential future devs who can help you with the code, so it would greatly help if technical questions were actually answered.

I'm really glad to see that you, HMC and Stoopkid are trying to get back to discussing the maths, but at the same time your (as in all three of you) lack of maturity is really appalling. You read like an old divorced couple who hate and yet still love one another, pulling each other's leg every few sentences, and lacing otherwise perfectly factual questions and answers with half-concealed vitriol and sarcasm. Rage quitting and yet coming back for more. Really not constructive. A giant waste of time and intelligence.

If you really care as much about Tau / Autonomic as you all claim, you should be able to keep your egos in check and apply a minimum of self-control to make sure the conversation doesn't turn into a nasty troll fest before it has time to reach its conclusion. After all, if both of you are confident about your maths, it really shouldn't be a problem to set the record straight. And if you are not, that's fine (who am I to judge?), but then stop acting as if you were Fields Medalists, and be humble and honest about what you don't know.
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 15/06/2017, 20:04:36 UTC
My apologizes but the report is under copyright , this is the report :
https://www.halfpasthuman.com/Hph_reports.html

Are there people who actually pay $99 for a collection of meaningless web metrics collected by a web scraper bot, and proceed to make investment decisions based on that?
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 21/04/2017, 12:21:48 UTC
Hi Ohad,

I read your roadmap for Tau, and am glad that you are repositioning your strategy toward a more inclusive bottom-up approach, and prioritizing the creation of collaborative tools for the community to start familiarizing itself with the specific programming paradigm used by Tau, as well as to start building models and applications. As discussed earlier, I think that we shouldn't be looking to create a collaborative environment only around Tau, but instead create an open platform in collaboration with other RDF-based self-amending crypto ledgers such as Autonomic, and possibly even Tezos if they get around to supporting RDF and OWL. There are several reasons for that. Let me reiterate them.

First, as the very slow progress of Tau has shown, this is a significant undertaking. Much more significant than you cared to admit initially. And it's paramount to use the limited resources we have as a community intelligently, by staying focused on building our common vision together rather than wasting time and resources building separate competing ecosystems.

From an economic and game theory perspective, it also doesn't make any sense to compete at the infrastructure level, since none of the economic incentives are connected to the underlying infrastructure, be it Tau or Autonomic. Should Autonomic make it to market first by a large margin, regardless of its relative qualities and shortcomings as per your and HMC's respective views, the obvious right thing to do (and one that all investors will rightfully require) will be to start building Agora on Autonomic, and later migrate it to Tau should Tau turn out to be a better fit. Basically, putting aside petty ego matters, it's in everybody's best interest that Tau and / or Autonomic make it to market asap, and that the whole ecosystem that was meant to come on top be equally suitable for either logical substrate.

But most importantly, as per my earlier argument, which still stands unchallenged: no matter how different the underlying forms of calculus, there exists a low enough abstraction level above which the code will be the same, so that everything written for one of the platforms will work seamlessly on the other. One obvious argument that demonstrates this is the fact that both projects claim to be able to recover OWL, if not directly at core-logic level due to different tradeoffs on expressiveness, then at least at the level of the blockchain, where unbounded iteration or arithmetic is recovered by continuation over a series of blocks. This at the very least establishes that everything in the system written using OWL can be shared between the projects. Although OWL has limited expressiveness and isn't suitable for expressing complex behaviors, a huge amount of what a typical program intended for human consumption does isn't computational in nature, and lends itself very well to being encoded as ontologies and linked-data sets in dynamic contexts. That means that the computational part, which may involve different operations and patterns in MSOL and MLTT, can be factored out into a core-logic library specific to each project, with everything else written as generic OWL and shared. This is precisely what BOSCoin is doing by introducing a time-constrained FSM (TAL) to deal with all the stuff that OWL isn't able to express. BOSCoin may not be a "self-amending" / "self-evolving" ledger as advertised, but it got at least that part right: most of the stuff can be factored out of the core logic and put in OWL format, where it can be universally shared with other projects, and thanks to which it can reuse a lot of the existing ontologies and datasets created by experts of all fields as part of the Semantic Web initiative.

Should Autonomic and Tau decide to build a common ecosystem using existing Semantic Web standards as much as possible for everything non-computational, we would be able to leverage all the tools developed for the Semantic Web like Protege, hire experienced ontologists from the Semantic Web community, and even start prototyping our stuff using a "naive Tau" approach by leveraging existing OWL reasoners like FaCT++, or even BOSCoin's OWL+TAL engine when they deliver it.

A last argument that should clearly establish the need for a common ecosystem of generic programs is that, in all likelihood, neither MLTT nor MSOL is the silver bullet, the characteristica universalis sought after as the holy grail of logic. As research keeps making progress, even better calculi will be discovered that get ever closer to the proverbial metal that the fabric of reality is made of, and we will want Tau / Autonomic to follow, to get closer to the metal too. What is a "better" calculus? It could be something with a better balance of expressiveness and decidability. But it could also be a form of logic that allows programs to be expressed as shorter strings, allowing the entire Tau universe to be compressed closer to its Kolmogorov complexity, which is very likely to become the name of the game anyway as Tau starts looking for a useful form of Proof-of-Work and Agora brings online swaths of idle computing resources begging to be arbitraged.

If we work in silos, with separate watertight ecosystems and programs infused with bits and pieces of the specific core logic they were designed to run on, neither Tau nor Autonomic will be able to upgrade when a better form of calculus is discovered, and it will take yet another team and yet another project to fill the gap, leading to even more fragmentation of the industry and a general lose/lose for everyone involved.

I know I'm partially repeating myself. I know this is a touchy question. I know egos are involved and still scalded by the earlier dispute.
But this time around I really hope to get a substantiated answer - if not from both sides, at least from you, Ohad - as to why I'm wrong, and why we should keep ignoring the other project, in spite of the fact that both projects have the same noble and philanthropic goals, and only diverging views on technical aspects that, in hindsight, aren't all that important after all. Again, why not just bury the hatchet and admit that Tau has got two equally valid and promising research initiatives - which everyone else would consider an advantage - that could happily cohabit within a common higher level framework, with the community actively building on it? Even OWL has got many different subsets corresponding to different forms of logic, and I'm not seeing anyone at the W3C making a fuss about it.

We have to design Tau as a calculus-agnostic project and get back the Autonomic folks on board. This is the only way for Tau to live up to its ambitions of universality. Ohad, HMC, any comments?
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 12/04/2017, 14:34:05 UTC
There was a good discussion between Arhur (Tezos) and BOScoin Team: https://bitcointalk.org/index.php?topic=1759662.msg18529378#msg18529378

Not sure what you consider to be "good" in that conversation. None of Arthur's questions about the protocol's alleged "self-evolving" properties is actually answered. The first set of answers just harps on how OWL is great, lends itself well to formal verification, and has a vast library of ontologies. And the second answer abruptly bails out of the conversation with a very lame "apples and oranges" excuse.
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 20/02/2017, 19:19:45 UTC
how do you expect people to form well-formed proposals, that actually express what they meant and so on?
do you expect this to be an individual process for experts?
on the new tau we consider this a collaborative process for non-experts, the process of forming a proposal, aside the process of accepting it.

I think it's wishful thinking to imagine that non-experts will be able to amend the protocol or deeply understand what they are voting for. Tezos's solution, apparently, is to allow people who don't understand what they are asked to vote on to delegate their votes to someone they believe knows better and shares their perspective. Not a bad idea. At best, you could have experts rephrasing the decision in simpler terms in plain English, but then you would still need to trust that they are not misrepresenting the problem or showing a bias, you would still need people to really try to understand what they say at a logical level, which isn't a given, and then there is the laziness. As pointed out in the interview, the DAO was a good example of how, in practice, most people won't really bother voting.
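To make the delegation idea concrete, here is a minimal sketch (my own illustration of liquid-democracy style tallying, not Tezos's actual mechanism): a non-voter's ballot is resolved by following their delegation chain, and a cycle or dead end counts as abstention.

```python
def tally(votes, delegations):
    """Count votes, resolving each non-voter through their delegation chain.

    votes:       {voter: choice} for people who voted directly
    delegations: {voter: delegate} for people who delegated instead
    """
    def resolve(voter, seen):
        if voter in votes:                 # voted directly
            return votes[voter]
        nxt = delegations.get(voter)
        if nxt is None or nxt in seen:     # dead end or cycle: abstain
            return None
        return resolve(nxt, seen | {voter})

    counts = {}
    for voter in set(votes) | set(delegations):
        choice = resolve(voter, {voter})
        if choice is not None:
            counts[choice] = counts.get(choice, 0) + 1
    return counts
```

So one expert voting "yes" with two delegators behind them counts as three "yes" votes, while a delegation loop simply drops out of the count.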

a smaller point would be regarding votes. once you take a close look, you don't need them anymore Smiley

Would that be a solipsistic look or a totalitarian look or an omniscient look Wink ?
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 20/02/2017, 18:59:13 UTC
It might be the only thing out there that is close to tau.

Well, now there is also Autonomic Smiley

that said, there's still a big gap.
I agree. Although there is quite a bit of overlap on the fundamentals, Tezos seems to be specifically aiming at protocol governance, whereas Tau is both much more general in scope and closer to the metal, thanks to the versatility of its RDF syntax, which gives it the ability to blend naturally with semantic structures of which it is only one flavor. I think we can probably learn a lot from the launch of Tezos, their approach, their mistakes, and how things pan out with the nomic game.

i'd ask two things before decidability: expressibility,
I asked that already

and which parts of the system are amendable (is it the whole code no matter what?).
In the EpiCenter interview, he talks about layers, of which only the topmost layer - the one containing functional rules about the way the blockchain works, plus the business logic - would be self-amending, which includes consensus rules and voting. Based on that, I think all of the lower level stuff like network communication, overlay network management, DHT etc. is hard coded and evolves following a regular software life cycle. This is actually made more explicit later in the interview, where Arthur Breitman explains that only changes involving governance issues are really controversial and subject to on-chain consensus, whereas more technical issues such as bug fixes follow the normal software life cycle and are adopted following the normal soft/hard fork rules as people update their clients.

1. need to understand what they mean by "ledger" and in what it's different than "generic knowledge"
probably ledger = blockchain, generic knowledge = whatever state the blockchain is used to maintain?
Can you provide the context?


2. ocaml impl?! ... the easy but eventually-useless way? Smiley
He mentioned in the interview that he was inspired by Coq for the formal verification. Coq is written in OCaml, so going with OCaml would probably have saved quite a bit of time. I can't find the detailed team page of Tezos anymore, but if I remember correctly there were some people from Inria, which would also explain why they went for OCaml.

Just noticed that Andrew Miller is an advisor of the project. Wasn't he also following Tau? And there is Zooko too! What a small world. I wonder if HMC knows the tezos guys.
Post
Topic
Board Announcements (Altcoins)
Re: Tezos discussion
by
klosure
on 20/02/2017, 16:06:40 UTC
Tezos seems to have a lot of common points with Tau.
@arthurb: can you confirm if the core language is total functional and if it remains expressive enough to perform more general programming? If it's not, how do you prove mathematically that core protocol operations will always be well behaved and terminate for all input sets?
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 20/02/2017, 15:58:11 UTC
Actually they already have a functional system and released their code on GitHub. Arthur Breitman started a thread here earlier this month, and it seems that they are looking at doing a crowdsale in Q1 or Q2, so pretty soon. Let's ask him directly if Tezos is decidable Smiley
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 20/02/2017, 15:41:21 UTC
Really interesting interview with Arthur Breitman, the lead developer of Tezos.
Tezos is one big leap forward in the direction of Tau. It has a lot of common points with Tau. For instance, the protocol is self-amending, following a process very similar to Tau's that is also inspired by the game of Nomic, which is actually discussed in the interview. Another important similarity is the use of a custom functional language, practical enough to develop with, that comes with built-in formal verification capabilities, the goal being to ship code with a mathematical proof of its properties. Another very similar aspect, beautifully explained in the interview, is that blockchain networks all follow essentially the same protocol at a very high level: when you abstract away specific details, like the exact nature of the state being validated and the exact way the canonical version is discriminated, blockchains are essentially all doing the same thing. So Tezos comes with built-ins that handle all the low level stuff like I/O as well as generic blockchain mechanics, and all the specific blockchain rules are left for the users to decide in the genesis block (they call it the "seed protocol") and subsequent protocol amendments. This allows Tezos to implement Bitcoin, Ethereum and any other blockchain. Even the idea of writing program specifications as a set of constraints and letting the compiler come up with an implementation that is guaranteed to meet the constraints is addressed. A last thing: Breitman alludes at some point to the fact that Ethereum is making a mistake by attempting to be a "universal computer" that can compute everything, and explains that a blockchain network doesn't need full spectrum computation and that a subset of what Ethereum can do is sufficient.
Although he doesn't point specifically to concepts such as Turing completeness and decidability, and doesn't state where Tezos stands on that aspect, it sure seems like that is what he is referring to, while keeping things simple to avoid perplexing the non-technical audience. Is Tezos decidable too?
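The "all blockchains are the same protocol with different parameters" point can be sketched roughly like this (my own toy illustration, all names hypothetical, not Tezos's actual interface): the generic mechanics stay fixed, while validation, state transition and fork choice are plugged in per chain.

```python
class ChainRules:
    """Hypothetical interface: what a specific blockchain must plug in."""
    def valid(self, state, block):        # is this state transition allowed?
        raise NotImplementedError
    def apply(self, state, block):        # next state after applying the block
        raise NotImplementedError
    def prefer(self, chain_a, chain_b):   # fork choice between two histories
        raise NotImplementedError

def extend(rules, state, chain, block):
    """Generic mechanics shared by every chain: validate, then append."""
    if not rules.valid(state, block):
        raise ValueError("invalid block")
    return rules.apply(state, block), chain + [block]

class Counter(ChainRules):
    """Toy 'blockchain' whose state is just a strictly increasing counter."""
    def valid(self, state, block):
        return block == state + 1
    def apply(self, state, block):
        return block
    def prefer(self, chain_a, chain_b):   # longest chain wins
        return chain_a if len(chain_a) >= len(chain_b) else chain_b
```

Swapping `Counter` for rules encoding Bitcoin's UTXO set or Ethereum's account model would leave `extend` untouched, which is the whole point of the abstraction.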

Now, that makes a hell of a lot of common points with Tau, doesn't it? And they are planning to release this year!
Ohad, are you following Tezos closely enough? What's your take on their technology?
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 22/11/2016, 11:52:34 UTC
Catching up with latest developments. I've read some of the chat logs and the heated exchange between Ohad and HMC.

There are good arguments on both sides. HMC is right that MLTT + TFPL is a more conservative choice, since it's already in use in a flurry of theorem provers, formal verification frameworks and languages. He is also right that MSOL's limitations wrt arithmetic *could* (theoretically, though no example is known) turn out to be problematic for expressing some categories of computations, although it's unclear if that would really be a problem for practical use of Tau. Ohad is right that the TFPL constraints and the non-axiomatic LEM make MLTT + TFPL difficult to program intuitively, and somewhat amputate expressiveness in a way that's difficult to fathom and (seemingly) undecidable. Both approaches have shortcomings and advantages, and it's an excellent thing that both are being researched, so that the project won't be stuck in a rut should one of the paradigms turn out to be unsuitable.

Now, where I'm really puzzled is that this difference of views has led to a complete breakdown of the project team, in spite of the fact that neither party has been able to properly back their claims. There are good arguments on both sides, but the so-called "critical flaws" pointed out by both sides are yet to be proven. Indeed, the limitations of MSOL over trees wrt arithmetic could be problematic, but is it a real problem? Modern computers use bounded arithmetic operations over 32 or 64 bit registers, all of which can be emulated with basic bit-wise logic operations, in a way that can be entirely flattened / unrolled so as to require no loop, and that can be broken down into several rounds to allow execution over multiple blocks / cycles, as was planned in the original Tau design to recover Turing completeness. If unbounded iteration can be recovered, why would arbitrary precision arithmetic be unrecoverable, when the very thing to be recovered in both cases is exactly the same: unboundedness? Now, wrt the claim that type checking as used in MLTT+TFPL is undecidable, making it very awkward to create proofs, it's also rather unsubstantiated: is it really undecidable? Without proof, the argument is moot. And even if it really were, does it really make it so awkward to create proofs in practice? Examples would be helpful. Surely the fact that it's being used in Idris and Agda should hint that it is at least somewhat usable.
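To illustrate the flattening claim (a sketch of the general idea, not the actual Tau encoding): a 4-bit ripple-carry adder where the arithmetic itself uses only AND, OR and XOR, fully unrolled so there is no loop anywhere; the shifts merely unpack and repack the bits, and wider registers just mean more of the same lines.

```python
def add4(a, b):
    """4-bit addition from AND/OR/XOR only; fully unrolled, no loop."""
    # Unpack the operands into individual bits.
    a0, a1, a2, a3 = a & 1, (a >> 1) & 1, (a >> 2) & 1, (a >> 3) & 1
    b0, b1, b2, b3 = b & 1, (b >> 1) & 1, (b >> 2) & 1, (b >> 3) & 1
    # Ripple-carry, one explicit stage per bit.
    s0 = a0 ^ b0
    c0 = a0 & b0                        # carry out of bit 0
    s1 = a1 ^ b1 ^ c0
    c1 = (a1 & b1) | (c0 & (a1 ^ b1))  # carry out of bit 1
    s2 = a2 ^ b2 ^ c1
    c2 = (a2 & b2) | (c1 & (a2 ^ b2))  # carry out of bit 2
    s3 = a3 ^ b3 ^ c2                  # final carry dropped: wraps mod 16
    # Repack the sum bits.
    return s0 | (s1 << 1) | (s2 << 2) | (s3 << 3)
```

The same scheme extends to any fixed width, which is all that real registers give you anyway.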

It seems to me that both parties have good arguments, and that at the same time both also jumped the gun and decided that they were right and the other wrong, based on incomplete premises propped up by ego matters. What's wrong with you guys? Whatever happened to "this too is why we need Tau"? How can you promote the use of logic as the central tenet of Tau, and yet fail to abide by it on the first occasion where a difference of views arises?

The most puzzling thing is that the debate has been exclusively centered around the artificially constrained question of who is right and who is wrong (as if it were an axiom that one had to be right and the other wrong, and that only one form of calculus could possibly express Tau, nevermind the Curry-Howard isomorphism), all the while completely ignoring the fact that it's entirely possible for both views to coexist happily within the project. Tau is a language, based on RDF, meant to come with a formally defined grammar and set of ontologies. As HMC stated earlier in this thread, even OWL-DL can be recovered over Tau, which means that, since both are expressed as RDF graphs, OWL-DL can be transformed into valid Tau code by mapping and graph transforms. Regardless of whether the Tau client uses MLTT+TFPL or MSOL over trees under the hood, it will still have to support OWL-DL, and programs written in the latter that are not undecidable should work. I take OWL-DL as an example, but Tau is a superset of OWL-DL, and everything in this superset should work equally well on any suitable logic.

The crux of the problem is that nobody knows exactly what the spec of Tau is. We know that Tau, as a language, is a superset of other languages it will have to support to be useful at all, like OWL-DL. But we don't quite know where exactly its boundary is. Is it a fundamental property of Tau to have limited recursion and no axiomatic law of the excluded middle? HMC will tell you it is, not because it was a requirement or something inherently desirable, but because shoehorning Tau into MLTT+TFPL implies that Tau should have these properties. Same thing on the other side: Ohad will tell you that it should be a fundamental property of Tau to have free recursion and the law of the excluded middle, but bounded arithmetic. Obviously, if everyone who comes up with a new calculus gets to decide what Tau is so that it works with it, we are going to go round in circles forever.

Tau shouldn't be what possible underlying forms of logic imply it should be. Tau should be that, and only that, which is required to express the family of programs that are both decidable and able to complete within bounded space (memory) and time on modern hardware, regardless of whether doing so implies unrolling loops or emulating multiplications. Since "modern hardware" is a moving target, programs should be shipped with the dimensions of the bounding box they are guaranteed to run in, together with other proofs of their properties, and should be able to run on any Tau client using any form of suitable logic. With a properly specified Tau language, you could have some Tau clients using MSOL over trees (assuming it's suitable) and others using MLTT+TFPL (assuming it's suitable), all capable of communicating over the tau-chain and crunching away happily. If that's not the case, we are doing something very wrong, and falling for the classic mistake of retrofitting the requirements to fit the model instead of creating the model based on the requirements.

Of course, this is easier said than done. As the debate between Ohad and HMC has illustrated, there are multiple ways that infinity can creep in, and different forms of logic enforce decidability by bounding calculus along different axes: recursion, or arithmetic. For a spec to be truly generic, it should include both a high level spec for developers (the language), which should be independent of the underlying logic, and a lower level spec with basic mechanisms to handle multiple forms of calculus, which would include a form of generic continuation allowing the underlying logic to create deferred payloads that, strung together, would allow Turing completeness to be recovered. The underlying logic / implementation would know what to do when it compiles the code, and how to package computation so that it runs in bounded time and space intrinsically, while allowing unbounded computation extrinsically, with the blockchain used as the one and only carrier for infinity.
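As a rough sketch of what such generic continuation could look like (my own toy model, not a spec): each block runs a bounded number of steps and, if the computation isn't finished, hands the intermediate state to the next block as a continuation payload. Unbounded iteration then exists only at the chain level.

```python
def run_bounded(step, state, budget):
    """Run at most `budget` steps; report whether the computation finished."""
    for _ in range(budget):
        done, state = step(state)
        if done:
            return True, state
    # Not finished: `state` is the continuation payload for the next block.
    return False, state

def run_over_blocks(step, state, budget):
    """String bounded rounds together: unbounded iteration, extrinsically."""
    blocks, done = 0, False
    while not done:            # chain-level (extrinsic) iteration
        done, state = run_bounded(step, state, budget)
        blocks += 1
    return state, blocks
```

For example, with a step function that sums a countdown, computing 1 + 2 + ... + 10 under a per-block budget of 3 steps finishes after 4 blocks, each of which is individually bounded in time and space.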

@HMC and @ohad: can we bury the hatchet and go back to the drawing board, so that Tau will no longer be bound to a single form of calculus? With such a model, both MLTT+TFPL and MSOL over trees could be implemented in parallel and trialed by fire. Tau would also become more future proof. After all, who knows: maybe a few years down the line, a new breakthrough in computer science will give rise to yet another form of calculus that turns out to be even better. It would be too bad if Tau were stuck in a too specific model and failed to evolve.

We are all in the same boat, hoping to have a way to unleash logic upon the world and enable a new era of global rationality. It would be a pity if that grand vision didn't come true due to petty cat fights between project founders. You are two smart guys, and I know you know what really matters, and that this doesn't include petty ego matters.
Post
Topic
Board Announcements (Altcoins)
Re: [ANN] Lykke - a global marketplace
by
klosure
on 09/10/2016, 05:27:30 UTC
A few words on security:

Currently the Lykke app will let you create a wallet without explaining that Lykke doesn't actually control the wallet but only stores an encrypted blob. People will naturally assume that there is a password recovery feature should they forget their password (which they always do). This is going to create a lot of trouble, as people end up locked out of their wallet upon losing / changing phones, or inadvertently logging out of their wallet without making sure they know the password. It is critical that the wallet app explain unambiguously that the wallet itself will become unusable should people forget their password, and that Lykke won't be able to recover it. The wallet app should also require people to back up their private key on paper (and put it in a safe place without showing it to anyone) so that they always have a last resort means to recover their wallet should they lose their smartphone and forget their password and/or PIN. Please, Lykke, make sure your app is foolproof. It will look bad to regulatory authorities if shareholders come complaining that they are unable to exercise their rights as shareholders due to technical reasons like losing access to their crypto wallet.

Another issue I see is that although the app allows you to back up your private key as a brain wallet, it doesn't currently offer a way to restore the wallet using said private key. At least I haven't seen anything like that after removing the app and trying to recover from the private key.

Last but not least, Lykke needs a desktop wallet, and this wallet should be open source and compile under Linux, Mac and Windows. Smartphone security is shaky at best, and nobody in their right mind is going to trust their smartphone with 5-digit figures, like what some people are probably investing in this ICO. I for one feel really uneasy about that. I have never trusted my smartphone with more funds than I usually have in cash in my physical wallet (so a few hundred dollars at most), and I refuse to use PayPal or a credit card from it for fear of this information being compromised. It feels really awkward to suddenly have to trust this poorly secured device with a large sum and hope it's going to be alright.

One more thing: I understand that LKK and other Lykke exchange based assets are really OpenAssets colored coins. Does that mean that they can be sent to any Bitcoin address and controlled using another OpenAssets client unrelated to Lykke? If that's the case, it could be a way to ship a desktop wallet quicker, as development could be based on some existing OpenAssets implementation.
Post
Topic
Board Announcements (Altcoins)
Re: [ANN] VOXELS (VOX) | The Official Coin of Virtual Reality and Voxelus Platform
by
klosure
on 09/09/2016, 11:42:04 UTC
Asking the same question for the 3rd time now in two weeks.

@jimblasko: why does Voxel need to be its own blockchain, in particular a crippled blockchain that's controlled centrally and therefore pointless? You would have much less maintenance overhead, much more transparency, and real decentralization if you used tokens like Counterparty, Omni or Ethereum based tokens. It would also be much simpler for everyone to use and for exchanges to support. There are many good projects that use these tokens: Maidsafe, Synereo, Storj, Agoras... You may have future plans to implement custom features at the blockchain level, but by then blockchain technology will have changed so much that your Litecoin fork will be totally obsolete anyway.
Post
Topic
Board Announcements (Altcoins)
Re: [ANN] VOXELS (VOX) | The Official Coin of Virtual Reality and Voxelus Platform
by
klosure
on 27/08/2016, 09:37:17 UTC
Seems like my question slipped through the cracks, so let me ask it again.

@jimblasko: why does Voxel need to be its own blockchain, in particular a crippled blockchain that's controlled centrally and therefore pointless? You would have much less maintenance overhead, much more transparency, and real decentralization if you used tokens like Counterparty or Omni or Ethereum based tokens. It would also be much simpler for everyone to use and for exchanges to support. There are many good projects that use these tokens: Maidsafe, Synereo, Storj, Agoras... You may have future plans to implement custom features at blockchain level, but by then blockchain technology will have changed so much that your Litecoin fork will be totally obsolete anyway.
Post
Topic
Board Announcements (Altcoins)
Re: [ANN] VOXELS (VOX) | The Official Coin of Virtual Reality and Voxelus Platform
by
klosure
on 25/08/2016, 17:55:26 UTC
@jimblasko: why does Voxel need to be its own blockchain, in particular a crippled blockchain that's controlled centrally and therefore pointless? You would have much less maintenance overhead, much more transparency, and real decentralization if you used tokens like Counterparty or Omni or Ethereum based tokens. It would also be much simpler for everyone to use and for exchanges to support. There are many good projects that use these tokens: Maidsafe, Synereo, Storj, Agoras... You may have future plans to implement custom features at blockchain level, but by then blockchain technology will have changed so much that your Litecoin fork will be totally obsolete anyway.
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 18/07/2016, 02:40:27 UTC
Ohad, your post on Facebook today about compilers seems like a major shift from what you've been working on.

@ohad: can you please crosspost anything you post on Facebook here or on idni.org?
It should not be necessary to expose oneself to privacy abuse to be able to follow an open source project.
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 22/06/2016, 20:29:46 UTC
Quote from: ohad
Hang on,
"If Turing completeness is faulty" "there will be no other option than to exit the position of using Turing complete technology."
Which one are you talking about? Unless you mean both?
"Turing-completeness" is to "Turing-complete" what "Madness" is to "Mad": just a way to name the quality of something being Turing-complete.

From what I've heard, if you believe in Turing completeness, pick Tau-Chain.
If you believe in software being Turing Complete, pick Ethereum.
This doesn't make sense to me. Typo?

turing complete languages leave you helpless predicting what your code is going to do, except the "wait and see" way
Ohad, can you give us an example of this? Show us how it would work, because I don't understand how it works. Smiley

I can try to give you an analogy. Let's take cargo ships for instance.

Generalist ships can carry stuff of all shapes and forms with no restriction other than their size and weight limits: space rockets, airplanes, cranes, train wagons, whatever. It's cool because you really can carry anything. But it comes with its own difficulties: it's pretty much impossible to plan in advance exactly how you are going to arrange the things you need to load so that they will fit neatly and optimally and won't move during the trip. Sometimes your guys at the dock will manage to find a solution quickly. Sometimes they'll have to load and unload things so many times that it seems to take forever. And that's when you don't have someone asking you to ship something so enormous that it blocks your docks for a week while you figure out how to ship it at all. This type of ship is Turing complete. In theory it could ship the moon. It could ship anything of any size if you have an infinitely large ship, an infinite number of dockers and an infinite amount of time. But in practice it doesn't really work like that: the actual ships that end up being used are all limited in size, and your docks have only so many cranes and so many dockers to help. Those ships are a watered-down finite version of the real thing. So what happens with the real-life finite Turing ships is that they work fine until they don't. And you can't really tell in advance when things are going to be smooth at the dock or when they will become really messy, because the only way to decide how things are going to fit and which ship to use is to try to fit them in the ships. Of course there are many trivial shipments, but the problem is you have no guarantee that a shipment will be easy to handle, difficult, or downright impossible.
But that's not the worst thing: the nightmare of Turing-complete shippers is the outsourcing business, that is to say when another shipping company asks them to ship the cargo of their clients, who may themselves be shipping companies outsourcing for other shipping companies, and so on. Since they are all in the same business of Turing-complete shipping as you are, they can't tell what the size of their cargo will be, and their clients don't know either, etc. And if you yourself start to outsource unknowingly to one of the clients of one of your own clients, that's where things start becoming self-referential and in some cases paradoxical, leading to capacity-planning decisions that are sometimes inconsistent.
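The "work fine until they don't" point can be made concrete with a classic toy example (my own illustration in Python, not something from the original post): a loop whose termination time nobody knows how to bound statically for arbitrary input, so the only way to learn the "dock time" is to run it.

```python
def collatz_steps(n: int) -> int:
    """Count iterations of the Collatz rule until n reaches 1.

    Despite the rule being trivial, no known static analysis bounds
    the step count for arbitrary n. Like the generalist ship, the
    only way to find out how much work an input needs is to try it.
    """
    steps = 0
    while n != 1:           # unbounded loop: termination is an open conjecture
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(27))    # 111 steps for such a small input -- surprising until you run it
```

A total functional language would simply reject this definition, because the recursion is not on a visibly shrinking structure.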

Some other ships are specialized in carrying containers. They can carry only containers, and all the containers need to have exactly the same dimensions. No exceptions allowed. The containers are spacious, and inside them you can arrange things the way you want, so it's not a problem for the large majority of typical use cases. This type of ship is called a total functional ship. The advantage is that even a 10-year-old could tell you exactly how many containers you can load on your ship given the dimensions of the deck and the height limit, so it's really easy to make sure that you always have exactly the right capacity for your cargo, and your dock works in continuous streaming, loading container after container and ship after ship 24/7. But the problem is that you just can't carry anything larger than a container, making the shipping solution non-complete. Well, in fact there is a little secret: with some coordination you can carry anything of any size, but you'll need your customers to be smart and figure out a way to break down the cargo into components that fit in a container. You'll still be able to ship a plane, a rocket, the moon or even the whole infinite universe, but it will all have to be done in small parts, chunks and/or raw materials that you reassemble on the other end, effectively recovering the full expressiveness of what Turing-complete ships are able to do, but over a controlled sequence of individual containers, possibly carried by an infinity of ships. That requires a lot more thinking and engineering ahead of time than just shipping things piecemeal, but the reward is that by the time the cargo arrives at the dock, you don't have to worry that it could be too big to handle, and there is always an answer to the questions of how to load the cargo and how long it will take your dockers to do the job.
As in the case of Turing-complete shippers, you can also handle the outsourcing business of other container shippers and outsource yourself, but since everybody can forecast what they are going to ship (nobody accepts cargo that hasn't already been quantified), it's impossible in the total functional shipping business to get requirements like "I will ship through you whatever is being shipped through me"; you get requirements like "I will ship 159 containers" instead. This forced determination in relationships prevents the occurrence of self-referential cargo and guarantees that capacity planning always leads to consistent results.
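The "container ship" property can be sketched the same way (again my own illustration; Python is itself Turing complete, so totality here is a discipline rather than an enforced guarantee): structural iteration over a finite manifest does exactly one step per element, so the cost is knowable before execution ever starts.

```python
def total_weight(weights: list[float]) -> float:
    """Fold over a finite manifest.

    Exactly len(weights) additions, knowable in advance from the
    input's size alone -- the counting a 10-year-old can do from
    the deck dimensions. No while-loops, no unbounded recursion.
    """
    total = 0.0
    for w in weights:   # structural iteration: one step per container
        total += w
    return total

print(total_weight([2.5, 3.0, 4.5]))  # 10.0
```

A total language makes this shape of definition the only shape allowed, which is what lets the platform quote cost and duration before loading begins.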

In this analogy, Tau-Chain is a container ship company and Ethereum a generalist ship company.

Tau-Chain can tell its clients how much their cargo will cost to ship and how long it's going to take, knows when a single ship isn't enough, and will use any and all available container ships of any size and manage to dispatch all the cargo optimally. It can make all sorts of predictions about the shipping, like checking that the weight is well balanced, or that the temperature in the containers stays within a certain range, etc. Clients can attach to their containers fast automated procedures called proofs, which took them quite some time to prepare but which allow the cargo to clear customs, security and quality controls automatically and very quickly at the arrival point, so that it can be deployed right away to its intended use and used on the spot. Another interesting aspect of Tau-Chain is that it finds its container system (Tau) so good that the company decided to eat its own dog food and sequence itself as a continuously evolving series of its own containers, with custom procedures to maintain its own integrity as it evolves.

Ethereum, on the other hand, can't quite tell in advance how big a ship will be needed for any specific cargo, nor whether it will fit in any ship at all, so what it does is let the clients decide for themselves what size of ship they want to use (a client would typically simulate a dock in his backyard to try to predict what volume his cargo could take) and make them pay for the service in advance. When the cargo arrives at the dock, if it fits in the planned ship, Ethereum sends the ship even if it's not full. And if the cargo doesn't fit, it's just thrown into the sea. Either way they keep the money. Cargo doesn't come with any sort of automated clearance test, so it's up to its intended users to figure out whether the cargo is correct and have it pass all clearance tests before they can use it safely (which is never entirely certain, as the case of the DAO has shown).
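The pay-in-advance mechanism described above can be sketched as a tiny metered interpreter (a hypothetical illustration of the general gas idea, not Ethereum's actual accounting, which charges different amounts per operation): you prepay a budget, work is aborted mid-way if the budget runs out, and the fee is kept either way.

```python
class OutOfGas(Exception):
    """The cargo is 'thrown into the sea': the prepaid ship was too small."""

def run_metered(steps, gas_limit):
    """Execute a sequence of zero-argument operations under a gas budget.

    Each step costs 1 gas (a simplification). If the budget runs out
    before the work finishes, execution aborts -- but the gas already
    burned is charged regardless of the outcome.
    """
    gas_used = 0
    for step in steps:
        if gas_used >= gas_limit:
            raise OutOfGas(f"aborted after {gas_used} steps")
        step()
        gas_used += 1
    return gas_used  # fee charged even if the 'ship' sailed half-empty

used = run_metered([lambda: None] * 3, gas_limit=10)
print(used)  # 3 -- prepaid for up to 10 steps, only 3 were needed
```

The point of the sketch: metering makes an unpredictable computation economically bounded without making it predictable, which is exactly the trade-off the analogy is drawing.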

I hope this analogy helps make these computing paradigms less abstract. There are many approximations, and concepts I had to stretch to make them work with the analogy, so it's not really exact, but it should give a rough idea of the differences and of how they affect the distributed computers that implement these paradigms.
Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 14/04/2016, 05:00:15 UTC
More important then that IMO is to enlist some more devs so no one precedes TAU achieving its goals.....in crypto you cant relay on 1 year from now....everything could happen
Tau is an open language with no strings attached. Why would someone want to fork it if they can get what they want much faster by simply contributing to its development?
It's like worrying that someone is going to fork Python. Could happen, but then you just end up with two languages with their own merits. The more the merrier.

Post
Topic
Board Announcements (Altcoins)
Re: Tau-Chain and Agoras Official Thread: Generalized P2P Network
by
klosure
on 05/04/2016, 12:06:48 UTC
I don't see any specification of this coin why.
There is no formal specification for Agora at the moment. The spec will likely be written directly in Tau logic once Tau-chain (the distributed platform on which Agora will be running) is live. In the meantime, you can read the Tau whitepaper, which covers the design of the Tau programming language and gives some hints of what the Tau-chain platform will look like. You can also look at the Zennet thread (the earlier iteration of Agora), which gives a lot of technical detail on the distributed computing model that Agora will be using. You will find a lot more information on the IDNI, Tauchain and Zennet websites, as well as in this thread.

I cna see however the IPO is still open and anyone can still buy right?
The presale is momentarily suspended due to the theft of some of the presale funds.
The Agora tokens are currently being reissued as a new asset (asset id 58 on Omni).
Trading will resume some time this week or next. Follow this thread for more information.
Since the current asset tokens (asset id 35 on Omni) will soon be invalidated, do not purchase these tokens from anyone.