Creating a first website in 10 steps
Creating a site that is clearly planned and carried out according to the following steps starts on a sound basis and is likely to find success with visitors.
1) Defining the type of site
Principal kinds of Web sites:
- A portal for a product or a company.
- An information site based on a topic (a technology, a branch of industry).
- An educational site, containing tutorials or courses.
- A site dedicated to a software product.
- A site for sharing something: videos, songs, etc.
The type of site determines the tools to use, in particular whether you intend to use a content manager to facilitate creation. Note that a forum or a directory can accompany a portal site, and that is even preferable.
The question is whether the site could have a growing audience, which is usually the goal of the webmaster.
- An online store whose products have descriptions similar to what is already on the Web will be penalized by Panda. If you also place links in many directories to make it known, it will be penalized by Penguin. Its only chance of success is in selling unique products, but that is a gamble: a large commercial site may later become interested in those products, and engines favor large sites.
- A blog or technical site, provided you talk about things that only you know, can also be a success if you are able to interest the public, which depends on your writing skills.
- News sites and forums have little chance of success today unless they are dedicated to a product and are the only ones covering it.
- Your main chance of success, if you are able to program, is in creating a service, provided it is unique. Your dependence on search engines will then be minimal.
- It is still possible to find an original idea; see for example the list of original sites in the links at the bottom.
2) The topic of the site and its public
For optimal indexing by search engines and a better ranking in results, a site should be devoted to a single, well-delimited topic. The first stage is thus to define exactly what the site will be devoted to.
The public for which the site is intended must also be well defined before creation.
Multiple topics in one site are not desirable, at least at first. Conversely, it is not desirable to create multiple sites on the same theme; a single bigger site is preferred.
3) Choosing a name
The name of the site can be assigned to a sub-domain, which is a sub-directory on the server, or to a domain name like www.scriptol.com. Choose a name that has value if possible (short, meaningful, a frequently searched keyword).
Even if you have free hosting with a sub-domain, it is preferable to register a domain name and make it point to your sub-domain (for example, http://www.scriptol.com would be redirected to http://scriptet.sourceforge.net). However, the benefit remains limited.
4) Choosing a host
The host should preferably support PHP 5 to run modern scripts. Hosts may or may not provide a CGI-BIN directory, and they differ in bandwidth and statistics systems; choose one according to your needs. A dedicated server offers more freedom if you can manage a server.
5) Static pages or CMS?
It is more and more frequent to use a CMS, or content management system, like WordPress to create a website, yet most often this is not necessary. The advantage of such software is that it incorporates a comment system alongside other services such as a blogroll and a tag cloud.
These last two tools are less and less used, because they may hurt the ranking in the results pages of search engines, which penalize excessive or artificial linking.
As for comments, thanks to online services such as Disqus or Facebook Comments, they are now possible on static sites. What remains are the themes and other plugins available for CMSs. These advantages can become disadvantages when the CMS is updated but the themes or plugins are not!
Finally, it is easier to create a new page with a CMS, but when you want to edit it, it becomes rather harder to access. So a CMS should be used for an e-commerce site or a blog, and not for other sites.
To make a static site, it is enough to upload a single page, named index.html, to the space provided on the server.
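As a minimal sketch, such an index.html page can be as simple as the following (the title and texts are only placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <title>My first website</title>
</head>
<body>
  <h1>Welcome</h1>
  <p>This is a minimal static page.</p>
</body>
</html>
```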
Knowing HTML syntax is not essential: you can write the page in an HTML editor (see the list in the links) and save it in HTML format.
A cross-platform ftp uploader such as Filezilla or PHP FTP Synchronizer (on this site) will allow you to put your work online.
6) Creating the contents
Ask yourself a series of questions to ensure that the site is not considered of poor quality by search engines, or by readers.
- Is the content original? (A search on the title and subtitle may be useful.)
Your site must be different and add something to what is already on the Web.
- Do users trust you when they read your page?
That depends on your references, but also on the details you provide.
- Do you give your visitors what they want?
It may be helpful to submit the site to others for advice.
- Do visitors want to read more pages, bookmark the site, click the Like or +1 button?
That depends as much on the interest of the content as on its presentation, which justifies the next section.
Google's style guide: the best is to start with good style rules, and Google offers its own advice.
7) Creating a more attractive site with images
To go beyond a purely textual site and offer visitors attractive web pages, a specialized HTML editor becomes essential, along with tools to generate logos and images.
The alt attribute of the img tag is a textual alternative to the image and is recommended. It contains a description of the image (and not of the page).
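For example, an image with a descriptive alt text (the file name and text here are placeholders):

```html
<img src="logo.png" alt="Logo of the Scriptol site" width="120" height="60">
```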
Style sheets will also be necessary for a really serious site. A ready-to-use model is proposed on this site.
Here is a list of tools to create or generate images for a website.
8) Optimization for search engines and promotion
Referencing consists in making your site visible on the Net; without it, the site would be completely ignored by the public.
The first stage is registration with each principal search engine. Then, once the website has sufficient content, submit it to the Dmoz.org directory and some bookmark sites.
Meta tags are tags whose content is not displayed to visitors but is taken into account by search engines. There are many of them, but only two are really important.
The <title> tag
It contains the title of the page and is required by the HTML specification. This title must absolutely be different for each page. It should not be a list of keywords but a descriptive sentence, and it should not start with the name of the site on each page.
Note that if the <title> tag is omitted, Google will use the first heading marked by <h1> </h1> or, failing that, <h2>, <h3>, etc.
However, other software may need the <title> tag.
The description meta tag
It lets you choose the snippet, the descriptive text shown in results pages, which should be attractive. When it is omitted, engines use the first paragraph of content.
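A minimal sketch of a page's <head> using both tags (the texts are only placeholders):

```html
<head>
  <title>Creating a first website in 10 steps</title>
  <meta name="description" content="A step-by-step guide to planning, building and promoting a first website.">
</head>
```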
The robot tag
Its format is as follows: <meta name="robots" content=""> and it is placed in the <head> section too. It is intended for the robots of search engines.
The content is one to three words separated by commas:
- index, noindex: to index the page or not.
- follow, nofollow: to take the links in the page into account, or not.
- archive, noarchive: to put the page in the cache of Google's servers (and other search engines) or not.
- all = index, follow, archive.
- none = noindex, nofollow, noarchive.
- noodp: the ODP description is ignored. For sites that are indexed in the Dmoz directory.
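For example, to keep a page out of the index while still letting robots follow its links:

```html
<meta name="robots" content="noindex, follow">
```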
Automatic generation of metas
The SpiderSEO software is a tool that assists the webmaster in preparing webpages for referencing by creating meta tags automatically.
The sitemap is recommended by Google to help its robots with indexing. However, the majority of the reports submitted to me are negative: creating and uploading a map, even in the standard XML format of sitemaps.org, has no effect on the indexing of pages or on search engine results in most cases. A sitemap is really useful when, for some reason, not all pages of the site are indexed. If you wish to create a sitemap, try a free software, Simple Map.
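A minimal sitemap in the sitemaps.org XML format looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
  </url>
</urlset>
```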
Building an RSS feed on your site to publicize the latest articles or selected pages is not difficult if you install the Free Online ARA RSS editor. It is an effective promotion tool, as explained in this article.
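Whatever tool you use, an RSS feed is just an XML file; a minimal skeleton looks like this (all titles and URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>My site</title>
    <link>http://www.example.com/</link>
    <description>Latest articles</description>
    <item>
      <title>First article</title>
      <link>http://www.example.com/first-article.html</link>
    </item>
  </channel>
</rss>
```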
The robots.txt file
It is not absolutely required, but it allows you to block certain parts of a site from search engines, such as a page under construction. A tutorial is given by Google, Getting started with robots.txt, which provides the list of directives to search engine robots.
Directives to crawlers may also be placed in web pages with the robots meta tags, or in the .htaccess file.
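For example, a robots.txt file that blocks a work-in-progress directory for all robots (the path is a placeholder):

```
User-agent: *
Disallow: /work/
```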
9) Making a dynamic site
The definition of Web 2.0 explains what this concept is.
This consists of using the dynamic site techniques described above, but also specialized libraries, provided in particular by search engines, which are combined to create new services (mashups).
Sites exist that are based entirely on a web service, using the new technologies of Web 2.0, and they are the object of transactions in the millions or billions of dollars.
PHP or ASP
PHP scripts are placed in an HTML page but executed on the server before the page is sent to browsers. A new page may be built according to the instructions of the scripts.
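As a sketch of the principle, here is a page with embedded PHP; the server runs the PHP block and replaces it by its output before delivering the page (the content is purely illustrative):

```php
<html>
<body>
<!-- The browser never sees the PHP code, only its output -->
<p>The sum of 2 and 3 is <?php echo 2 + 3; ?>.</p>
</body>
</html>
```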
ASP is an alternative to PHP. It runs only on a Windows server.
Cascading style sheets, for the presentation of the page: defined once, used everywhere.
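A minimal sketch of such a shared style sheet: the rules are defined once in a file (here named style.css as an illustration) and apply to every page that links it:

```css
/* style.css: defined once, linked from every page of the site */
body { font-family: sans-serif; margin: 2em; }
h1   { color: #336699; }
```

Each page then links it with <link rel="stylesheet" href="style.css"> in its <head> section.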
10) Maintenance of your site
How many links point to your site, according to search engines? Once referencing is achieved, after a deadline of one month, you will want to know the result. An estimate may be obtained from Google Webmaster Tools or Bing Webmaster Tools.
From time to time, preferably once a month, you should check whether the links to external sites are still valid. That can be done automatically with the link checker command provided by this site.
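The principle of such a link checker can be sketched in a few lines of Python using only the standard library; this version just extracts the links of a page (fetching each URL and reporting the broken ones is left to the real tool):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag found in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Example: list the links of a page given as a string.
page = '<p><a href="http://www.example.com/">Example</a> and <a href="page2.html">local</a></p>'
print(extract_links(page))  # ['http://www.example.com/', 'page2.html']
```

A real checker would then open each collected URL (for example with urllib.request) and report those that return an error.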
If your site is created under Windows, seeing how it looks under another operating system may be useful. Using a Linux Live CD is the best way to do so at no cost.