Pages

Friday 29 July 2011

HTML WYSIWYG Authoring Tools


The last few years have seen an explosion in the web authoring tool market. WYSIWYG (what-you-see-is-what-you-get) HTML editors have graphical interfaces that make writing HTML more like using a word processing or page layout program. In the beginning, their goal was to spare authors from ever having to touch an HTML tag, in the way that page layout programs protect designers from typing out PostScript. Today, their role has shifted towards making document production more efficient and automated while still providing access to the HTML source.

Should You Use Them?


These days, nobody pretends that WYSIWYG authoring tools will excuse you from learning HTML completely, but they do provide a considerable head start for many menial tasks. Because these tools are notorious for adding extra code to HTML files, the question of whether or not to use them for web production has become something of a holy war among web developers.


HTML purists insist that hand-writing HTML in a no-frills text editor is the only way to do it “right,” and that the HTML documents made by WYSIWYG tools are of unacceptable quality. On the other hand, many developers appreciate being spared the grunt-work of typing every HTML tag and find the WYSIWYG environment useful for viewing the page and making design decisions on the fly. Of course, there are many reasons both for and against using these tools. The controversy should lessen as the tools, which are currently in their infancy, work out their kinks and start producing clean and robust code. If you do use a WYSIWYG tool, expect to do some manual fine-tuning to the resulting HTML code.


Pros



  •  They are good for beginners. They can even be useful for teaching HTML because you can lay out the page the way you want, then view the resulting HTML.
  •  They are good for quick prototyping. Design ideas can be tried out on the fly.
  •  They provide a good head start for creating complex tables and other advanced features such as JavaScript and DHTML effects (see the example after this list).
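
For example, a table that uses rowspan and colspan attributes, the kind of structure that is tedious to type by hand but easy to rough out in a visual editor, might look like the sketch below. This is a hand-written illustration only; the data is made up.

    <table border="1">
      <tr>
        <th rowspan="2">Product</th>
        <th colspan="2">First quarter</th>
      </tr>
      <tr>
        <th>Units</th>
        <th>Revenue</th>
      </tr>
      <tr>
        <td>Widgets</td>
        <td>120</td>
        <td>$2,400</td>
      </tr>
    </table>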


Cons



  •  They are infamous for not generating clean HTML documents. They add proprietary or redundant tags and often take circuitous routes to produce a desired effect; some may even produce HTML that is incorrect (see the comparison after this list).
  •  Some editors automatically change an HTML document when you open it in the program. They add their own tags and may strip out any tags they do not recognize.
  •  The built-in graphics-generating features do not offer much control over the quality or the file size of resulting graphics.
  •  Software releases tend to lag behind the quickly changing HTML standards, so the HTML you create using the tool may not be completely up-to-date.
  •  They are expensive. The more powerful packages cost hundreds of dollars up front, with additional costs to upgrade.
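
To illustrate the first complaint above, compare a hand-written paragraph with the kind of redundant markup a visual editor might produce. The second snippet is a made-up but typical example, not the actual output of any particular product:

    <!-- hand-written -->
    <p class="intro">Welcome to our site.</p>

    <!-- the kind of markup a WYSIWYG editor might generate -->
    <p style="margin: 0px; padding: 0px;">
      <span style="font-family: Arial, Helvetica, sans-serif;">
        <span style="font-size: 12px;">
          <font color="#000000">Welcome to our site.</font>
        </span>
      </span>
    </p>

Both versions display the same paragraph on screen; the second simply carries far more markup to download and maintain.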



Tools and Technologies of the Web

Nowadays there is an enormous range of tools, both paid and free, for developing a website.
Because of this awareness and popularity, many websites are created and maintained. It is partly because of this huge number of websites and connected devices that the Internet is moving from IPv4 to IPv6.

The web is the simplest way to get information from anywhere in the world, about almost anything.
Just as there are tools for creating and maintaining websites, there are tools to search data from all over the world, called search engines.

Here we will discuss some of these types of web technologies.

Web Language








Web languages are the tools that help in designing the layout and internal logic of the web. The design, logic, and internal workings are all implemented through code. Some of these languages are HTML, XML, JavaScript, PHP, XHTML, etc.
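
As a minimal sketch of how these languages divide the work, the page below uses HTML for the layout and a few lines of JavaScript for the internal logic. The element ids and the greet() function are invented purely for illustration:

    <!DOCTYPE html>
    <html>
      <head>
        <title>Greeting page</title>
        <script type="text/javascript">
          // Internal logic: read the name field and write a message into the page
          function greet() {
            var name = document.getElementById("name").value;
            document.getElementById("output").innerHTML = "Hello, " + name + "!";
          }
        </script>
      </head>
      <body>
        <!-- Layout: a text field, a button and an empty paragraph for the result -->
        <p>Your name: <input id="name" type="text">
          <button onclick="greet()">Greet</button></p>
        <p id="output"></p>
      </body>
    </html>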


Search Engine



Search engines are the tools that crawl over the internet and index the data all over the network. When a user searches for some word through these tools, they look into the indexed data and provide the user with the most relevant links.
Some examples of search engines are Google, Yahoo, Bing, etc.
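
As a rough sketch of that index-and-lookup idea (not how Google, Yahoo or Bing are actually implemented), the script below builds a tiny inverted index over three invented “pages” and then looks a word up in it; all of the names and data are made up:

    <script type="text/javascript">
      // A tiny, made-up corpus: page address -> page text
      var pages = {
        "page1.html": "the web is a collection of linked documents",
        "page2.html": "search engines crawl the web and index documents",
        "page3.html": "browsers display web documents to users"
      };

      // Build an inverted index: word -> list of pages containing that word
      var index = {};
      for (var url in pages) {
        var words = pages[url].split(" ");
        for (var i = 0; i < words.length; i++) {
          var word = words[i];
          if (!index[word]) { index[word] = []; }
          if (index[word].indexOf(url) === -1) { index[word].push(url); }
        }
      }

      // Answer a one-word query by looking it up in the index
      function search(word) {
        return index[word] || [];
      }

      // Lists page1.html, page2.html and page3.html
      document.write(search("documents").join(", "));
    </script>

A real search engine also ranks the matching pages by relevance; this sketch simply returns every page that contains the word.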

The Birth of the Web







In 1989, Tim Berners-Lee, working at CERN (the European Organization for Nuclear Research), made the first proposal for the World Wide Web (WWW).

A prototype of the software was demonstrated the following year; it was at CERN that the first browser was developed, simple though it was.

To encourage its adoption, interfaces were provided to the CERN Computer Centre's documentation, to the ‘help service’, and to the familiar Usenet newsgroups.

This NeXT computer, used by Tim Berners-Lee at CERN, became the first web server.

In the beginning, all the web servers were located in European physics laboratories, and only a few people had access to the NeXT platform on which the first browser ran.
CERN soon promised to provide a much simpler browser that could run on any system. This was the start of a great era for humankind, an era of knowledge, and it is knowledge that makes people more and more powerful. The next mission for CERN was a global presence for the web, so that knowledge from every part of the world would be available; long-distance communication would, of course, rely on satellites. All of this recalled the prediction made by Arthur C. Clarke, who said in May 1970 that satellites would one day "bring the accumulated knowledge of the world to your fingertips", and today that is a fact.
The CERN data centre in 2010, housing some WWW servers.
It was in 1991 that an early WWW system was released. It included the simple browser, web server software, and a library implementing the essential functions for developers to build their own software. A wide range of universities and research laboratories started to use it. A little later it was made generally available via the Internet, especially to the community of people working on hypertext systems.


The WWW's first global journey came when the first web server in the United States was installed in December 1991 at a pure research institute: the Stanford Linear Accelerator Center (SLAC) in California.


At this stage, there were essentially only two kinds of browser. One was the original development version, very sophisticated but only available on the NeXT machines. The other was the ‘line-mode’ browser, which was easy to install and run on any platform but limited in power and user-friendliness. It was clear that the small team at CERN could not do all the work needed to develop the system further, so Berners-Lee launched a plea via the Internet for other developers to join in.

Several individuals wrote browsers, mostly for the X Window System. The most notable from this era were MIDAS by Tony Johnson of SLAC, Viola by Pei Wei of O'Reilly, and Erwise by a group from the Helsinki University of Technology.

Early in 1993, the National Center for Supercomputing Applications (NCSA) at the University of Illinois released the first version of their Mosaic browser. This software ran in the X Window System environment, popular in the research community, and offered friendly window-based interaction. Shortly afterwards the NCSA released versions for the PC and Macintosh environments as well. The existence of reliable, user-friendly browsers on these popular computers had an immediate impact on the spread of the WWW. The European Commission approved its first web project (WISE) at the end of the same year, with CERN as one of the partners. By late 1993 there were over 500 known web servers, and the WWW accounted for 1% of Internet traffic, which seemed a lot in those days! (The rest was remote access, e-mail and file transfer.) 1994 really was the ‘Year of the Web’. The world’s first International World Wide Web Conference was held at CERN in May. It was attended by 400 users and developers, and was hailed as the ‘Woodstock of the Web’. As 1994 progressed, Web stories got into all the media. A second conference, attended by 1,300 people, was held in the US in October, organised by the NCSA and the newly formed International WWW Conference Committee (IW3C2).
Graphic representation of a minute fraction of the WWW, demonstrating hyperlinks

By the end of 1994, the Web had 10,000 servers, of which 2,000 were commercial, and 10 million users. Traffic was equivalent to shipping the entire collected works of Shakespeare every second. The technology was continually extended to cater for new needs. Security and tools for e-commerce were the most important features soon to be added.

Open standards

An essential point was that the Web should remain an open standard for all to use and that no-one should lock it up into a proprietary system.
In this spirit, CERN submitted a proposal to the Commission of the European Union under the ESPRIT programme: ‘WebCore’. The goal of the project was the creation of an international consortium, in collaboration with the Massachusetts Institute of Technology (MIT) in the US. Berners-Lee officially left CERN at the end of 1994 to work on the Consortium from the MIT base. But with approval of the LHC project clearly in sight, it was decided that further Web development was an activity beyond the Laboratory’s primary mission. A new home for basic Web work was needed.

The European Commission turned to the French National Institute for Research in Computer Science and Control (INRIA) to take over the role of CERN.

In January 1995, the International World Wide Web Consortium (W3C) was founded ‘to lead the World Wide Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability’.

By 2007 W3C, run jointly by MIT/LCS in the US, INRIA in France, and Keio University in Japan, had more than 430 member organizations from around the world.