Much of the programming on the web is anachronistic. If you think about how most web developers work, they are indistinguishable from people editing word-processor documents ca. 1983. Most edit text documents and the associated meta tags by hand, then apply some form of macro processing to convert their hand-coded templates into a presentation document. The only difference between 1983 and now is that back in '83 the presentation layer was a piece of paper; now it is a computer screen.
The state of the art back in 1983, however, was far different. Smalltalk was already a viable system (provided you could afford 2 MB of RAM). In that environment you could edit your GUI in your GUI, and build tools in your tools. Every feature of every object was discoverable, because the source of the system was available at run time.
Fast forward to the late 80s and early 90s, and we have systems like Self, in which you can build GUIs entirely via drag and drop. Programming, too, is entirely graphical, with each object manipulated directly via a standard idiom. Self is what the web could have been, were it not for several anachronisms that became entrenched in how web developers do things.
Jump forward again to the late 90s, and everything cool was done in Flash. In Flash we had a VM with enough power to create cross-platform apps that ran reasonably well. The problem with Flash was that its development model was built around the needs of the for-profit company that owned the entire tool chain. As a result, it never had the potential to become a platform with which non-programmers could develop any familiarity. Designers loved it because it worked similarly to their tools, but those tools are so specialized and technical that their users rival programmers for esoteric knowledge and pain tolerance.
I hope we don't wait another 30 years.