Intelligent Endpoints

A central aspect of the Internet, in contrast with other systems such as the telephone exchange, is ostensibly the focus on ``smart'' endpoints: the idea being that Internet nodes will largely be unintelligent, following simple rules, and that knowledge of any mechanism built atop the Internet should lie with the systems at either end of a message. The Internet follows this only in the basest sense, as it has failed to enlighten its protocols as computing has progressed; the smart endpoints have been held back. An egregious example of this is how mere character streams are still the most common interface. If the Internet had intelligent endpoints, rather than merely smart ones, they'd be better able to learn. A system in which the intelligence lay with the network would be worse in many ways, but it would certainly make better use of its bandwidth and not be so difficult to teach new tricks; it's obscene that newer low-level Internet protocols can be hindered by network segments which whitelist the protocols they support. So in one respect, the Internet hasn't been made suitably unintelligent.

The WWW is a good resource for examples of wasted resources. I've very often downloaded a large resource only to be required to download it again in full, whether because the HTTP server declined to support partial downloading, because the WWW browser deleted a valid copy downloaded with another tool, or for yet other causes. The WWW can't be fairly criticized without also criticizing UNIX, because they're the same flavour of system, and many of these issues could be solved by proper file restoration, a means to coherently share system resources amongst programs, and yet other facilities UNIX doesn't provide.

When I change a remote resource and seek to incorporate those changes, generally the entire resource will be transferred to overwrite the old copy, even for single-bit changes. Increasing the intelligence of both sides to use a delta format is an obvious solution, but it usually doesn't get done. It's very easy to waste megabytes of sent data, because bandwidth is considered so plentiful that no effort is made to avoid unnecessary transfer.

Mine Elision system is part of this basic realization that each side of the Internet can hold very large resources, and should decrease network traffic by doing so. Rather than instate a distribution system, or add caching mechanisms, it simply takes pains to send less data.

The only reason the Internet works at all is that it's built to be a quagmire of incoherence. It now has the ability to perform heavy computation at each node, but this is primarily used for spying. Compression can and should be used where feasible, but it's no excuse for sending data which shouldn't be sent anyway. It's excusable to lack partial downloading for resources which are very small and constitute messages, rather than a single large datum; such support is also difficult to add in a reusable fashion for global use, in a network with so low-level an idea of its purpose. Still, I know the Internet is unsuited to its purpose with its current protocols, and I want to improve this.
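To make the partial-downloading complaint concrete, here is a minimal sketch of resuming an interrupted transfer over HTTP, assuming the server honours Range requests; the URL and file name are hypothetical placeholders, and the sketch falls back to a full download when the server ignores the range.

    # Resume a download from however many bytes already sit on disk,
    # assuming the server supports HTTP Range requests.
    import os
    import urllib.request

    def resume_download(url, path):
        offset = os.path.getsize(path) if os.path.exists(path) else 0
        request = urllib.request.Request(url, headers={"Range": "bytes=%d-" % offset})
        with urllib.request.urlopen(request) as response:
            # 206 means the server honoured the range; 200 means it ignored
            # the header and is sending the whole resource over again.
            mode = "ab" if response.status == 206 else "wb"
            with open(path, mode) as out:
                while True:
                    chunk = response.read(64 * 1024)
                    if not chunk:
                        break
                    out.write(chunk)

    resume_download("https://example.com/large-resource", "large-resource")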
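The delta idea admits an equally small sketch. This one assumes, for simplicity, that the sender still holds the copy the receiver has, so it can describe the new version as spans to copy plus literal data to insert; a real scheme such as rsync removes that assumption with rolling checksums over the receiver's copy. The function names here are mine, not part of any existing format.

    # The sender, holding both versions, describes the new resource in terms
    # of spans the receiver already has plus the literal text that changed.
    import difflib

    def make_delta(old, new):
        delta = []
        for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(None, old, new).get_opcodes():
            if tag == "equal":
                delta.append(("copy", i1, i2))      # receiver already holds this span
            else:
                delta.append(("data", new[j1:j2]))  # only the changed text travels
        return delta

    # The receiver rebuilds the new resource from its old copy and the delta,
    # so the unchanged spans never cross the network.
    def apply_delta(old, delta):
        parts = []
        for op in delta:
            parts.append(old[op[1]:op[2]] if op[0] == "copy" else op[1])
        return "".join(parts)

    old = "The quick brown fox jumps over the lazy dog.\n"
    new = "The quick red fox jumps over the lazy dog.\n"
    assert apply_delta(old, make_delta(old, new)) == new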