Creating a first website in 10 steps

A site whose creation is clearly planned and carried out according to the following steps will start on a sound footing and stand a good chance of success with visitors.

1) Defining the type of site

Principal kinds of Web sites

The type of site will determine the tools to be used, in particular whether you will rely on a content manager to ease creation. Note that a forum or a directory can accompany a portal site, and that is even preferable.
The real question is whether the site can attract a growing audience, which is usually the webmaster's goal.

2) The topic of the site and its public

For optimal indexing by search engines and a better ranking in results, a site should be devoted to a single, well-delimited topic. The first stage is thus defining exactly what the site will be devoted to.
The public for which the site is intended must also be well defined before creation begins.
Multiple topics on one site are not desirable, at least at first. Conversely, it is not desirable to create multiple sites on the same theme; a single, larger site is preferable.

3) Choosing a name

The name of the site can be assigned to a sub-domain, which corresponds to a subdirectory on the server, or to a proper domain name. Choose a name that has value if possible: short, meaningful, a frequently searched keyword.
Even if you have free hosting with a sub-domain, it is preferable to register a domain name and make it point to your sub-domain (the domain then simply redirects to the sub-domain's address). However, the benefit remains limited.

See the articles on domain names and the value of a domain name.

4) Choosing a host

The host should preferably support PHP 5 or later to run modern scripts. Hosts differ in whether they provide a CGI-BIN directory, in bandwidth, and in their statistics systems; choose one according to your needs. A dedicated server offers more freedom, if you are able to manage a server.

See: how to choose a Web host.

5) Static pages or CMS?

It is more and more frequent to use a CMS (content management system) such as WordPress to create a website, even though most often this is not necessary. The advantage of such software is that it incorporates a comment system along with other services such as a blogroll and a tag cloud.

These last two tools are less and less used, because they may hurt the ranking in search results pages: engines penalize excessive or artificial linking.

As for comments, thanks to online services such as Disqus or Facebook Comments, they are now possible on static sites too. That leaves the themes and plugins available for CMSs. These advantages can become disadvantages when the CMS is updated but the themes or plugins are not!

Finally, it is easier to create a new page with a CMS, but when you later want to edit it, access becomes rather more difficult. A CMS is therefore best reserved for an e-commerce site or a blog, not for other kinds of sites...

To make a static site, it is enough to upload a single page, named index.html, to the space provided on the server.
Knowing HTML syntax is not essential: you can write the page in an HTML editor (see the list in the links) and save it in HTML format.
A cross-platform FTP uploader such as FreeFTP or PHP FTP Synchronizer (open source, on this site) will let you put your work online.
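Such a minimal index.html could look like the sketch below; the title and text are placeholders to be replaced by your own content:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>My first site</title>
</head>
<body>
  <h1>Welcome</h1>
  <p>This page is online as soon as it is uploaded as index.html.</p>
</body>
</html>
```

Once this file is at the root of your hosting space, the site answers at its address with no further configuration.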

6) Creating the contents

Ask yourself a series of questions to ensure that the site will not be considered of poor quality by search engines, or by readers.

Here is a summary of the list given by Google.

Is the article written by an expert, or by an enthusiast with good knowledge of the subject?

Is there duplicate, overlapping, or redundant content on the same topic, with small variations of keywords?

Does the article contain spelling errors, bad style, or untruths?

Are the topics genuinely chosen, or chosen only to rank better in search results?

Does the article give original information, a new story, its own research, a personal analysis?

Does the page add value compared to other pages on the same topic?

Is the site recognized as an authority on the topic?

Is the article well written, or does it look hastily produced?

Does the article give a complete and exhaustive description of the topic it deals with?

Are the articles short, without substance or interest?

Are the pages produced with great attention to detail, or without concern for details?

Google's style guide: the best thing is to start from good style rules, and Google offers its advice.

To go beyond a purely textual site and offer visitors attractive web pages, a specialized HTML editor becomes essential, accompanied by tools to generate logos and images.
The alt attribute of the img tag is a textual alternative to the image and is recommended. It should contain a description of the image (not of the page).
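For example (the file name and description below are invented for illustration):

```html
<img src="rocket.png" alt="Drawing of a blue rocket taking off">
```

The alt text is read by search engines and by screen readers, and is displayed when the image cannot be loaded.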

Here is a list of tools to create or generate images for a website.


7) Making a 404 error page

When the server does not find the page requested by the browser, it returns error code 404.
We must ensure that the user is not confronted with such a raw error message. To do this, create a special page that will be displayed instead of the missing page; it then remains to redirect the visitor to it...
Note that WordPress and other CMSs already take error redirection into account.
Create or fill a .htaccess file with the following content:

ErrorDocument 404 /error.html
This file should be at the root of the site, often in the "www" subdirectory. You can download an archive with a ready-to-use 404 file.

The domain name must not be included in the URL, because that impedes search engines: it replaces the 404 code with a 200 code, which indicates a successful load. Google will refuse your sitemap if a full URL with the domain name is used to redirect errors.

Instead of mocking the visitor, you can build an intelligent 404 page: one that lists links related to the missing page the user requested. This is easily done with the keyword tools in WordPress. Another good idea is to include a search box in the error page.
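A minimal error.html along these lines could offer a link to the home page and a search box. In this sketch, "/search" is only a placeholder: point the form at whatever search facility your site actually provides.

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Page not found</title>
</head>
<body>
  <h1>Sorry, this page does not exist or no longer exists</h1>
  <p>You can return to the <a href="/">home page</a>.</p>
  <!-- "/search" is hypothetical: replace it with your site's
       real search endpoint -->
  <form action="/search" method="get">
    <input type="text" name="q" placeholder="Search this site">
    <input type="submit" value="Search">
  </form>
</body>
</html>
```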

8) Entering Web 2.0 with JavaScript, CSS, Ajax

The definition of Web 2.0 explains what this concept is.
It consists of using the dynamic-site techniques described above, but also specialized libraries, provided in particular by search engines, which are combined to create new services (mashups).
Some sites are based entirely on a Web service using these new technologies, and they are the object of transactions in millions or billions of dollars.


JavaScript

A scripting language embedded in the HTML code of Web pages, it makes it possible to modify their content, for example to process forms. You may see on this page how to use JavaScript and where to find tools.
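Form processing is the classic use case. Here is a minimal sketch of a client-side check; the field names and messages are hypothetical, not taken from any real form:

```javascript
// Minimal client-side form validation (hypothetical field names).
// Returns an error message, or null when the input is acceptable.
function checkContactForm(fields) {
  if (!fields.name || fields.name.trim() === "") {
    return "Please enter your name.";
  }
  // A simple (deliberately not exhaustive) email pattern.
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(fields.email || "")) {
    return "Please enter a valid email address.";
  }
  return null;
}
```

In a page, such a function would be attached to the form's submit event, canceling submission when it returns an error message.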


CSS

The use of Cascading Style Sheets will also be necessary for a really serious site, for a reusable presentation of pages or to change the presentation according to the device: PC or mobile. It is better to use CSS from the start of the design.
A ready-to-use model is proposed on this site in the CSS section.
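A small illustration of device-dependent presentation, with a media query; the selectors and widths here are invented for the example:

```css
/* Default: a fixed-width centered column on desktop screens */
#content { width: 960px; margin: auto; }

/* On small screens, let the page use the full width instead */
@media (max-width: 600px) {
  #content { width: auto; }
}
```

The same HTML page is thus presented differently on a PC and on a mobile, without duplicating content.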


PHP

PHP scripts are placed in an HTML page but are executed on the server before the page is sent to browsers. A new page may be built according to the instructions of the scripts.
ASP is an alternative to PHP that runs only on a Windows server. Other languages may also be used for the same purpose, including JavaScript with Node.js.
With XAMPP, you can get PHP, Apache, and MySQL working locally before hosting the site.


Ajax

Ajax is the combination of JavaScript, the DOM, CSS, and a server-side language such as PHP or ASP. Using specialized libraries (frameworks) eases the task by providing commonly used functions. It is better to use either a simple library such as Anaa or a more powerful Ajax framework such as jQuery, the most commonly used.
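In essence, Ajax means requesting data from the server without reloading the page, then patching the result into the DOM. A sketch, where the URL, the data shape, and the element id are all hypothetical:

```javascript
// Turn the server's JSON (a hypothetical list of comments)
// into an HTML fragment.
function renderComments(comments) {
  return comments
    .map(c => `<p><b>${c.author}</b>: ${c.text}</p>`)
    .join("\n");
}

// In the browser this would be wired up with fetch (or jQuery's $.get):
// fetch("/comments.json")
//   .then(r => r.json())
//   .then(data => {
//     document.getElementById("comments").innerHTML = renderComments(data);
//   });
```

Only the fragment changes; the rest of the page stays as it is, which is what distinguishes Ajax from a full page load.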

9) Be ready for search engines

Referencing consists in making your site visible on the Net; without it, the site would be completely ignored by the public.
The first stage is submission to each principal search engine. Then, once the website has sufficient content, come the directories and some bookmarking sites.
For more precise details on referencing, consult the SEO tutorial.
Your pages must be ready to be indexed by engines. That depends on their structure and meta tags.


Meta tags

Meta tags are tags whose content is not displayed to visitors but is taken into account by search engines. There are many of them, but only four are really important.

The <title> tag

It contains the title of the page and is recommended by the HTML specification. This title must absolutely be different for each page. It should not be a list of keywords but a descriptive sentence, and it should not start with the name of the site on every page.
Note that if the <title> tag is omitted, Google will use the first title marked up with <h1> </h1>, or failing that h2, h3, etc.
Other software may nevertheless need the <title> tag.

The description meta tag

It lets you choose the snippet, the descriptive text shown in results pages, which should be attractive. When it is omitted, engines build a snippet from the content of the page, often using the first paragraph.

The viewport tag

<meta name=viewport content="width=device-width, initial-scale=1">

It informs browsers on mobile devices that the page must be adapted to a smaller screen. It has no effect on the desktop.

This is accompanied by CSS rules for small screens: a variable page width, and links separated vertically by at least 22 pixels. (You can look at the bottom of this page's stylesheet for an example.)

The robots tag (option)

Its format is as follows: <meta name="robots" content=""> and it is placed in the <head> section too. It is intended for the robots of search engines.
The content is one to three directives separated by commas, chosen among values such as index or noindex, follow or nofollow, and noarchive.

The canonical tag

This is the fourth essential tag; strictly speaking it is a <link rel="canonical"> element rather than a meta. It indicates which page should be indexed when several pages are identical. See the canonical script for more info.
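Putting these four together, a page's <head> section could look like the following sketch; the title, description, and URL are invented for the example:

```html
<head>
  <meta charset="utf-8">
  <title>Choosing a web host: criteria and pitfalls</title>
  <meta name="description" content="How to compare hosts: PHP support, bandwidth, statistics, and when a dedicated server is worth it.">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <meta name="robots" content="index, follow">
  <link rel="canonical" href="https://www.example.com/choosing-a-host.html">
</head>
```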

Special files

Site map (option)

The sitemap is recommended by Google to help its robots with indexing. However, the majority of the reports submitted to me are negative: creating and uploading a map, even in the standard XML format, has no effect on the indexing of pages or on search results in most cases. A sitemap is really useful when, for some reason, not all pages of the site are indexed. If you wish to create a sitemap, try a free tool, Simple Map.
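The standard XML format is simple; a minimal sitemap listing a single page looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Each additional page of the site gets its own <url> entry.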

RSS Feed (option)

Building an RSS feed on your site to publicize the latest articles or selected pages is not difficult if you install the free online ARA RSS editor. It is an effective promotion tool, as explained in this article.
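For reference, a minimal RSS 2.0 feed with a single item has this shape; titles, links, and descriptions are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example site</title>
    <link>https://www.example.com/</link>
    <description>Latest articles from the site</description>
    <item>
      <title>A new article</title>
      <link>https://www.example.com/new-article.html</link>
      <description>Short summary shown in feed readers.</description>
    </item>
  </channel>
</rss>
```

Each new article becomes an additional <item> at the top of the channel.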

The robots.txt file (option)

It is not absolutely required, but it allows you to block certain parts of a site from search engines, such as a page under construction. A tutorial is given by Google, Getting started with robots.txt, which provides the list of directives for search engine robots.
Directives to crawlers may also be placed in web pages, with the robots meta tags, or in the .htaccess file.
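A short example of such a file, where /drafts/ stands for a hypothetical work-in-progress directory to keep out of the index:

```text
User-agent: *
Disallow: /drafts/
Sitemap: https://www.example.com/sitemap.xml
```

The file must be named robots.txt and placed at the root of the site.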


Choosing the keywords

As Google says: make a website for Internet users, not for search engines! But for search engines, only the text matters, along with the criteria that make your page deserve a good ranking in results, the first of which is its set of significant words.
It is the webmaster's choice to build a site around the articles he wants to write, or rather to write the articles most likely to reach a large audience.
But whatever the strategy and level of involvement, all that engines see is a list of keywords.

We must therefore insert the keywords most relevant to the page, and all their synonyms, knowing that those placed at the beginning, in the title, and in subtitles are considered more important.

The AdWords Keyword Planner gives you ideas for groups of keywords from a topic or a starting word. As the tool is designed for advertisers, its results match what users are actually looking for.

10) Maintenance of your site

How many links point to your site, according to search engines? Once referencing is achieved, after a delay of about a month, you will want to know the result. An estimate may be obtained from Google Webmaster Tools or Bing Webmaster Tools.
From time to time, preferably once a month, check whether the links to external sites are still valid. That can be done automatically with the link-checker tool provided by this site.
If your site is created under Windows, it may be useful to see how it looks under another operating system. Using a Linux Live CD is the best way to do so at no cost.