Creating a first website in 10 steps

Creating a site, if it is clearly planned and carried out according to the following steps, starts on a sound footing and is likely to meet with success among Net surfers.

1) Defining the type of site

There are several principal kinds of Web sites: blog, forum, portal, directory, e-commerce site, static showcase...

The type of site will determine the tools to be used, in particular whether you intend to use a content manager to facilitate creation. Note that a forum or a directory can accompany a portal site, and that is even preferable.
The question to ask is whether the site can attract a growing audience, which is usually the goal of the webmaster.

2) The topic of the site and its public

For optimal indexing by search engines and better ranking in results, a site should be devoted to a single, well-delimited topic. The first step is therefore to define exactly what the site will be devoted to.
The public for which the site is intended must also be well defined before creating it.
Multiple topics on one site are not desirable, at least at first. Conversely, it is not desirable to create multiple sites on the same theme; a single bigger site is preferable.

3) Choosing a name

The name of the site can be assigned to a sub-domain, which is a sub-directory on the server, or to a domain name like www.scriptol.com. Choose a name that has value if possible (short, meaningful, a frequently searched keyword).
Even if you have free hosting with a sub-domain, it is preferable to register a domain name and make it point to your sub-domain (for example, http://www.scriptol.com could be redirected to http://scriptet.sourceforge.net). However, the benefit remains limited.

See also: domain name and the value of a domain name.

4) Choosing a host

The host should preferably support PHP 5 to run modern scripts. Hosting plans may or may not include a CGI-BIN directory, and offer varying bandwidth and different statistics systems; choose one according to your needs. A dedicated server offers more freedom if you can manage a server.

How to choose a Web host.

5) Static pages or CMS?

It is more and more frequent to use a CMS, or content management system, such as WordPress to create a website, even though most often this is not necessary. The advantage of such software is that it incorporates a comments system alongside other services such as a blogroll and a tag cloud.

These last two tools are less and less used because they may affect the ranking in the results pages of search engines, which penalize excessive or artificial linking.

As for comments, thanks to online services such as Disqus or Facebook Comments, they are now possible on static sites too. That leaves themes and other plugins, which can be found for the CMSs. These advantages can become disadvantages when the CMS is updated but the themes or plugins are not!

Finally, it is easier to create a new page with a CMS, but when you want to edit it later, the page becomes rather harder to access. So a CMS should be used for an e-commerce site or a blog, and not for other kinds of sites...

To make a static site, it is enough to upload a single page, named index.html, to the space provided on the server.
Knowing HTML syntax is not essential; you can write the page with an HTML editor (see the list in the links) and save it in HTML format.
A cross-platform FTP uploader such as FileZilla or PHP FTP Synchronizer (on this site) will allow you to put your work online.
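As an illustration, a minimal index.html could look like the sketch below; the title and text are placeholders to replace with your own content.

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>My first website</title>
    </head>
    <body>
      <h1>Welcome</h1>
      <p>This is the home page of my first website.</p>
    </body>
    </html>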

6) Creating the contents

Ask yourself a series of questions to ensure that the site is not considered of poor quality by search engines as well as by readers.

Google's style guide. It is best to start with good style rules; Google offers its advice.

7) Creating a more attractive site with images and CSS

To go beyond a purely textual site and offer visitors more attractive web pages, a specialized HTML editor becomes essential, accompanied by tools to generate logos and images.
The alt attribute of the img tag is a textual alternative to the image and is recommended. It contains a description of the image (and not of the page).
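For example, an image with its alt description could be written as follows (the file name and text are just placeholders):

    <img src="logo.png" alt="Logo of the site" width="120" height="60">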

The use of Cascading Style Sheets will also be necessary for a really serious site, for a reusable presentation of the pages or to change the presentation according to the device: PC, mobile. It is better to use CSS from the start of the design.
A ready-to-use model is proposed on this site in the CSS section.
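A minimal sketch of how a shared stylesheet is linked and adapted to mobile screens; the file name and the 600px breakpoint are only illustrative.

    <link rel="stylesheet" href="style.css">

    /* In style.css: a base rule plus a media query for small screens */
    body { font-family: sans-serif; max-width: 60em; margin: auto; }
    @media (max-width: 600px) {
      body { font-size: 110%; padding: 0 1em; }
    }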

Here is a list of tools to create or generate images for a website.

8) Optimization for search engines

Referencing consists in making your site visible on the Net; without it, the site would be completely ignored by the public.
The first stage is submission to each principal search engine. Then, once the website has sufficient content, submit it to the Dmoz.org directory and some bookmarking sites.

For more precise details on referencing, consult the SEO tutorial and the SEO bible for a better ranking.

Meta tags are tags whose contents are not displayed to visitors but which are taken into account by search engines. There are many of them, but only two are really important.

The <title> tag

It contains the title of the page and is recommended by the HTML specification. This title must absolutely be different for each page. It should not be a list of keywords but a descriptive sentence, and it should not start with the name of the site on every page.
Note that if the <title> tag is omitted, Google will use the first heading marked up with <h1></h1>, or failing that h2, h3, etc.
However, other software may need the <title> tag.
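A typical title, placed in the <head> section of the page (the wording here is just an illustration):

    <title>Creating a first website in 10 steps</title>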

The description meta tag

It lets you choose the snippet, the descriptive text shown in results pages, which should be attractive. When it is omitted, engines use the first paragraph of content.
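It is written like this; the description itself is of course only an example:

    <meta name="description" content="A step-by-step guide to creating and publishing a first website.">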

The robot tag (option)

Its format is as follows: <meta name="robots" content=""> and it is placed in the <head> section too. It is intended for the robots of search engines.
The content attribute holds one to three keywords separated by commas, as in the example below.
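Typical values are index, noindex, follow, nofollow and noarchive. For example, to let a page be indexed without its links being followed:

    <meta name="robots" content="index, nofollow">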

Site map (option)

The sitemap is recommended by Google to help its robots with indexing. However, the majority of reports submitted to me are negative: creating and uploading a map, even in the standard XML format of sitemaps.org, has no effect on the indexing of pages or on search engine results in most cases. A sitemap is really useful when, for some reason, not all pages of the site are indexed. If you wish to create a sitemap, try a free software, Simple Map.
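For reference, a minimal sitemap in the sitemaps.org XML format looks like this; the URL and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-01-15</lastmod>
      </url>
    </urlset>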

RSS Feed (option)

Building an RSS feed for your site to publicize the latest articles or selected pages is not difficult if you install the free online ARA RSS editor. It is an effective promotion tool, as explained in this article.
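Whatever tool you use, a minimal RSS 2.0 feed has the following structure; titles, links and descriptions are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>My site</title>
        <link>http://www.example.com/</link>
        <description>Latest articles from my site</description>
        <item>
          <title>A new article</title>
          <link>http://www.example.com/new-article.html</link>
          <description>Short summary of the article.</description>
        </item>
      </channel>
    </rss>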

The robots.txt file (option)

It is not absolutely required, but it allows you to block certain parts of a site from search engines, such as a page still in progress. A tutorial is given by Google: Getting started with robots.txt, which provides the list of directives for search engine robots.
Directives to crawlers may also be placed in web pages with the robots meta tag, or in the .htaccess file.
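For instance, a robots.txt file placed at the root of the site could exclude a directory of unfinished pages; the path is only an example:

    User-agent: *
    Disallow: /work-in-progress/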

9) Enter the Web 2.0 with Javascript, PHP, Ajax

The definition of Web 2.0 explains what this concept is.
It consists of using the dynamic site techniques described above, but also specialized libraries, provided in particular by search engines, which are combined to create new services (mashups).
Some sites are based entirely on a Web service, using the new technologies of Web 2.0, and they are the object of transactions in millions or billions of dollars.

Javascript

A scripting language embedded in the HTML code of Web pages. It makes it possible to modify their contents, for example to process forms. You may see on this page how to use Javascript and where to find tools.
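A small sketch of a script embedded in a page and reacting to a form; the function name and field id are invented for the example:

    <form onsubmit="return checkName();">
      <input type="text" id="name"> <input type="submit" value="Send">
    </form>
    <script>
    // Refuse to submit the form if the name field is left empty
    function checkName() {
      var name = document.getElementById("name").value;
      if (name === "") { alert("Please enter a name."); return false; }
      return true;
    }
    </script>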

PHP

PHP scripts are placed in an HTML page but are executed on the server before the page is sent to browsers. A new page may be built according to the instructions of the scripts.
ASP is an alternative to PHP that runs only on a Windows server. Other languages may also be used for the same purpose.
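A one-line sketch of PHP embedded in a page; the server replaces the script with its output, here the current date, before sending the page:

    <p>Today is <?php echo date("l j F Y"); ?>.</p>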

Ajax

It is the combination of Javascript, the DOM, CSS and a server-side language like PHP or ASP. Using specialized libraries (frameworks) facilitates the task by providing commonly used functions. It is better to use either a simple library such as Anaa or a more powerful Ajax framework such as jQuery, the most commonly used.
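Underneath these libraries, an Ajax call is simply a Javascript request that updates part of the page without reloading it; a bare-bones sketch follows, where the file name data.txt is a placeholder for a resource on your server:

    <div id="result"></div>
    <script>
    // Fetch a text file from the server and display it in the div above
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
      if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById("result").innerHTML = xhr.responseText;
      }
    };
    xhr.open("GET", "data.txt", true);
    xhr.send();
    </script>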

10) Maintenance of your site

How many links point to your site according to search engines? Once referencing is achieved, after a month or so, you will want to know the result. An estimate may be obtained from Google Webmaster Tools or Bing Webmaster Tools.
From time to time, once a month preferably, check whether the links to external sites are still valid. That can be done automatically with the link checker command provided by this site.
If your site is created under Windows, seeing how it looks under another operating system may be useful. Using a Linux Live CD is the best way to do so at no cost.