How Can Your Company Website Become Effective for the Marketing of Your Products?
It is well known that a good company website is essential for successful marketing. There are many reasons for that:
- If you get visitors to your website – no matter by what means – and they are disappointed there, you may lose plenty of opportunities. They will go away and look at something else. You must try to keep people happy.
- Your website can convey a certain image of your company – hopefully the desired image. People may get an impression e.g. concerning your technical competence, whether you care about their needs, how easy it is to communicate with you, etc.
- Particularly in a complex technological area like photonics, buyers need substantial information on products before they can make purchase decisions. This applies especially to highly expensive products such as laser systems. If excessive uncertainties remain after studying product descriptions on a website, you may not win the required trust that the offered solution is good – whether it is really good or not.
A good company website is also the required basis for many other online activities. If you managed to obtain many visitors with a large e-mail campaign, for example, but then lead them to a poor website, you could not profit much.
You may also be interested in my more general article on photonics marketing.
Good Websites Can Only Result From a Carefully Designed Process
A good and effective company website is not easy to produce. Well, you can get a specialized company to do that for you, beginning with an intense dialog on what the nature of your company and products is, what kind of image you want to convey, what kind of search engine ranking may be relevant for you, etc. If they don't ask you such questions in the beginning, they will probably not be very helpful in the end.
I emphasize that to make clear that one should never primarily think about visual website design, as offered by people in the graphics business. The essential point is to convey certain information with textual and visual means – where information means more than just some facts, also including things like a general impression of a company and its qualities.
Therefore, the mentioned intense dialogue is essential. It involves a lot of thinking and ideally is also nurtured by external views. If you understand this, you will not be surprised to find out that the creation (or the later improvement) of a company website is often a major undertaking, consuming substantial resources like money and working time.
Many Details to Be Observed
A clearly defined and convincing concept is key for success, and that may take quite some creative work. Besides, there are numerous details which have to be taken care of – some examples:
- A website should be structured such that it is simple for users to find the relevant information. Avoid the feeling of “getting lost”.
- If the page structure is changed, one should make sure to avoid any “broken links”. Some automatic means should be installed and regularly used.
- If multiple people can make changes in the context of website maintenance, that must be properly coordinated.
- Nowadays, websites are often visited not with the large screens of office computers and notebooks, but with various mobile devices like tablets and smartphones. Although I doubt that many people will use a smartphone when working on purchase decisions, you definitely want your pages to be usable on a wide range of devices, including those with a relatively small screen. (Small screens can nowadays still have a huge number of pixels, but if small print can be read only by young people with splendid eyes, there is a problem!) So, you should have a responsive design, i.e., a web design which automatically adapts to the conditions of the viewer.
- Web servers should respond rapidly. Although computers are getting more powerful all the time, there are implementations which are so inefficient that the server response becomes sluggish – particularly when the server is overloaded with a lot of traffic. Note that most users are quickly getting impatient; the user experience can be substantially impaired.
- Websites can have serious security leaks, particularly if they allow postings and the upload of materials by the users. It takes substantial technical expertise to realize where possible hazards are located and how they can be cured.
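The automatic broken-link check mentioned above can be illustrated with a minimal Python sketch. This is not a full crawler – the page store is simply a dict mapping page paths to HTML sources, and a real tool would also fetch pages over HTTP and verify external URLs:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_internal_links(pages):
    """pages: dict mapping page path (e.g. 'index.html') to its HTML source.
    Returns (page, target) pairs for internal links whose target page
    does not exist; external and mailto links are ignored here."""
    broken = []
    for path, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        for target in parser.links:
            if target.startswith(("http://", "https://", "mailto:", "#")):
                continue  # only internal page links are checked in this sketch
            if target.split("#")[0] not in pages:
                broken.append((path, target))
    return broken
```

Running such a check after every structural change of the site (e.g. as part of the publishing workflow) catches broken internal links before readers do.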
How to Obtain Many Visitors?
When you produce your website, of course, you hope for many visitors. Besides, you want visitors who are relevant for your business – that's the quality of traffic.
The most important factors for that are the properties of the website itself (on-page optimization), although to some extent one can support the success with external measures. The crucial factor for the success of the website is that it is useful for the desired audience. If it is not useful, there is no chance, since nobody has a good reason to spend time on it.
Of course, you first have to define your desired audience and think about it:
- What kind of people do you really want to address? What are their interests and their needs? Which keywords do you think they will use when using search engines?
- Where are they located? Anywhere in the world, or e.g. only in the United States?
- What do they read? In particular, which websites are they using most?
- What could be a good reason for them to visit your site?
It takes some creativity – and often a lot of work as well – to reach the crucial goal of producing something useful for the intended audience. When I started the website of RP Photonics in 2004, I wondered why anybody might be keen to spend time on the website of a small new company; what useful content could be provided? Soon after, I came up with the central idea: offer a great encyclopedia, covering much of the area of photonics, at least my own field of expertise, in which I originally wanted to offer technical consulting and somewhat later also simulation and design software. That led to the creation of the RP Photonics Encyclopedia, which quickly became highly successful – so successful that it can nowadays be used as a great marketing tool not only for RP Photonics itself, but for many companies of the photonics community worldwide.
Obviously, for any other company it now takes some more creativity than just deciding to also offer a great photonics encyclopedia – because (a) you would be competing with a very well established one and (b) not everyone can produce something like that (or have it produced) with a reasonable effort. But it is already good if you can offer high-quality materials of perhaps more specialized nature on certain subjects related to your products. Always keep in mind: just shouting “my products are the greatest” will not do the job; you need to serve the readers, as otherwise they will go away.
You have probably heard about search engine optimization, and you may wonder how RP Photonics has managed to obtain such an outstanding search engine ranking. I can tell you what the essential point is: once again, offer highly useful content – in this case, articles covering much of the area of photonics which are so useful that readers come back again and again, and some of them also place links to this resource (apart from spreading messages via social media and other channels). In addition, one should avoid a couple of technical mistakes with which one could spoil the good effect of the content. However, search engines are nowadays using such refined algorithms that they can find out quite reliably whether good content is there, with little influence from tiny technical details.
So, essentially, search engines will do the job, but you can support them to some extent by properly handling important keywords in your texts. In particular, make sure that the most important keywords (possibly including some common variations) are occurring at least a few times on the relevant pages. Looking at web pages of competitors may give you some ideas for additional terms to be used.
It also matters where exactly certain keywords show up. It is particularly important that essential keywords appear in the title tag of the page. Also, it is good if they appear in headings. On the other hand, some meta tags which were originally meant to support search engines are nowadays largely ignored, since their content is just too easy to fake.
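Such keyword checks are easily automated; here is a rough Python sketch. The function name is made up, and the simple regex-based tag stripping is only for illustration – a real audit tool would parse HTML more robustly:

```python
import re
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Captures the text content of a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def keyword_report(html, keywords, min_count=2):
    """For each keyword, report how often it occurs in the page text
    (case-insensitive) and whether it appears in the title tag."""
    parser = TitleExtractor()
    parser.feed(html)
    text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag stripping
    report = {}
    for kw in keywords:
        count = text.count(kw.lower())
        report[kw] = {
            "count": count,
            "enough": count >= min_count,
            "in_title": kw.lower() in parser.title.lower(),
        }
    return report
```

Running this over the most important pages quickly reveals which target keywords are missing from page texts or title tags.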
By the way, there are certain old SEO tricks which can easily backfire, since search engines will kill your ranking if they feel that you tried to cheat them.
It is possible that one part of the website, containing a lot of useful content and acquiring correspondingly strong traffic, becomes useful for other parts of the same website (or even for another website, to which it extensively links). For example, our software business profits substantially from the encyclopedia, largely because the encyclopedia contains illustrations made with that software. However, it is not automatically the case that one part of the website profits from other parts. There must be a meaningful relation to obtain that effect. The perhaps clearest example is the RP Photonics Buyer's Guide, which enormously profits from the RP Photonics Encyclopedia because all encyclopedia articles related to photonics products are directly linked to the corresponding buyer's guide pages with suppliers for those products (and vice versa).
Obviously, such effects are not results of “tricks”, which could be easily copied by competitors, but rather of natural relations which are based on a lot of diligent work.
External Measures for Supporting Website Success
How about the briefly mentioned external measures? In the early days, people paid for being included in various listings, some of which developed into little more than large link farms; forget about all this, since search engines are not stupid enough to be fooled that way. It may still help to get “organic links”, i.e., links which at least look like they resulted from personal judgment. Such links are extremely cumbersome to obtain if you need many – and exactly for that reason they remain a good criterion for search engines: something which one cannot easily fake, and in most cases a good indicator of quality.
There are many agencies which essentially offer the generation of such inbound links from other websites. For such purposes, they author (or purchase) articles containing links to your site, and then motivate other website owners to place those articles (often for a fee). That may work if it is done well, but it seems that often it is just wasted money and time. For example, articles of poor quality are placed on websites which are not that relevant.
Basically, my advice is: forget about external measures for search engine optimization – just optimize your own place, and that primarily concerns the content. Imagine, for example, how much time I would have to spend to personally convince only 50 people to place a link to the RP Photonics website. Also, consider that the huge majority of the thousands of incoming links have been placed by people who were simply convinced by the quality of the content – I think that is the way to go.
To avoid any misunderstanding: I am not saying that you should forget about external measures for any purpose. It is only that search engine optimization generally doesn't work well that way. It can still make a lot of sense to generate additional high quality inbound traffic, e.g. with enhanced buyer's guide entries.
Of course, you can still attract visitors to your website with some additional measures, as discussed in the following. However, you should regard that just as a means to get additional traffic, e.g. for temporary campaigns, not as something to improve the general standing and ranking of your website. For example, buyer's guide listings – whether paid or free – are highly unlikely to improve your search engine ranking, whatever they tell you (although they may still be quite useful for other purposes). This is particularly obvious for buyer's guides which have a low ranking themselves. By the way, there are indeed cases where certain technical issues of such database-type websites diminish the chances for high search engine ratings. But those issues would probably be difficult to fix with a reasonable effort.
Some websites profit from content generated by their users. A classical example is to operate a forum, where users can enter questions and answers, and ideally engage in full discussions on a certain topic which is relevant for your website. Besides creating interesting information for yourself, that can also help to increase the search engine ranking because you get additional text, often with somewhat different use of important keywords, without writing it yourself.
Unfortunately, that approach has some serious disadvantages:
- If you let anyone post any content, you may soon find a lot of stuff on your site which you really don't want to see there – perhaps even legally problematic things, or content which offends other users. So, you have to regularly monitor the incoming content to quickly delete it in case of problems, or better let the users register before they can enter text. You may even want to look through the entered content before it is published – but that obviously means a lot of work.
- You also have to be careful that there are no security loopholes on your website, which could e.g. allow people to insert some malicious code to steal data from your server, inject malware to the computers of your users (exploiting bugs in their browsers), etc. Truly horrible things can happen, and it is not easy to make sure that this is close to impossible.
Therefore, one should certainly not open a new web forum or something like that without first carefully thinking about the implementation and the involved work.
Security – Website Encryption and More
For a long time, encryption was used for the transmission of web pages only at a few places where clearly sensitive data are handled – for example, for online banking. Many people still think that encryption is useless, only wasting some resources, when used on websites like ours. After all, the content is public anyway, so why encrypt it on the way to the readers?
Well, there are some excellent reasons to do so. For example, non-encrypted data transmission may not only be read by third parties, but also manipulated. Imagine that a user is reading your pages on a notebook in a hotel, connected to the Internet via WLAN. The web traffic may be intercepted by some other guest in the hotel, who offers his own WLAN access point (pretending that it is the hotel's) and gets your reader's computer to use it. All the user's website requests are properly forwarded to the targeted website, and the page content is sent back – but with some additional code injected. That way, the attacker may inject some criminal malware into the user's computer, which subsequently steals credit card data or puts the computer into a farm of hacked computers to be used for other criminal activities like denial-of-service attacks. By the way, an encrypted WLAN connection does not prevent that kind of attack.
That problem alone is definitely reason enough for us to protect the whole website traffic – not only that corresponding to certain web forms through which users send possibly confidential data to us. A reader's web traffic may still be intercepted by criminals, but they will see only encrypted content; the encryption goes all the way to the web server, not just to the WLAN access point, and the server's certificate should be checked by the browser. Well, there are still certain possibilities for man-in-the-middle attacks, but the risks are at least strongly reduced.
For such reasons, browsers will increasingly label non-encrypted websites as unsafe, and your search engine ranking will be hit if you don't employ comprehensive encryption. Fortunately, it is not that difficult to do for someone with a certain technical understanding, the cost is also not a problem, and the caused reduction of website performance (increased response times) is modest.
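One technical detail of comprehensive encryption is that plain-HTTP requests should be permanently redirected to the HTTPS version of the same page. A minimal Python sketch of such a check could look as follows; the function name is hypothetical, and how you obtain the status code and `Location` header (with an HTTP client of your choice) is left out:

```python
from urllib.parse import urlsplit

def is_proper_https_redirect(request_url, status, location):
    """Checks whether the response to a plain-HTTP request properly
    redirects to the HTTPS version of the same host.
    status: numeric HTTP status code; location: the Location header value."""
    if status not in (301, 308):   # permanent redirects are preferred here
        return False
    req = urlsplit(request_url)
    loc = urlsplit(location or "")
    return loc.scheme == "https" and loc.netloc == req.netloc
```

Checking a few representative URLs of your site this way (ideally automatically, after configuration changes) ensures that no page remains reachable without encryption.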
On many websites, it takes several seconds to load a page until one starts viewing and reading. In these seconds, a substantial fraction of potential readers may be lost already – they get impatient and return to the Google search page, for example, to find some other page on the topic. By the way, Google then also learns that people have quickly bounced back, and that will eventually reduce the search engine ranking of the page.
In 2021, Google also started to use page speed metrics explicitly as inputs for determining search engine rankings. Many of the Google Chrome browsers determine certain speed metrics and communicate them to Google. From that, Google compiles detailed data called the Core Web Vitals. The technical details are often highly non-trivial; there are metrics like LCP (Largest Contentful Paint), FID (First Input Delay) and CLS (Cumulative Layout Shift), which depend in complicated ways on technical details of the website implementation, and to some extent also on details of the used browsers. I can tell from my own experience that it can be rather difficult to track down the origin of certain penalties in these metrics so that you can address problems to improve user experience and your Google ranking. Well, it may be less critical for a simple photonics supplier website, but I definitely consider such things as essential for the RP Photonics website, which I am therefore regularly optimizing with great care.
Monitoring Website Traffic and Measuring Success
For many business activities, it can be vital to measure the success and then optimize your actions accordingly. In particular, that can be relevant for the creation and maintenance of a website, which (a) is a substantial undertaking and (b) can have strong effects on the success or failure of the company. Obviously, one should thus regularly monitor how it works.
An essential element is regularly analyzing the statistics of the web server. You will want to know, for example, the following:
- How does your overall web traffic develop over time? That can be measured in page views and perhaps also in terms of unique visitors (although the latter is difficult to get reliably, in particular when respecting privacy regulations).
- Which pages of your website attract most of the traffic (entry pages), and on which pages will people often stop browsing (exit pages)? Analyzing that may provide some good ideas for further improvements.
- Where do your visitors come from? In most cases, your web server will be told a “referrer address” when someone follows a link on another website to one of your pages. In your statistics, you should, for example, find out how many visitors you get from which domains. For example, you may get a referral from our page with suppliers for ultrafast lasers; the page name is www.rp-photonics.com/bg/buy_ultrafast_lasers.html, and therein you find the domain rp-photonics.com. You should be interested to know how many visitors you get from rp-photonics.com in comparison to other domains like photonics.com, laserfocusworld.com or optics.org, e.g. to estimate the value of advertising on different websites.
In larger companies, hopefully some IT person can extract such information for you. (Of course, any company must have at least one person who can do such things.) For example, ask for the referrer domain statistics. Usually, you can effortlessly find such statistics when logging into the control panel of your website hosting provider. This should give you at least some basic data – for example, for finding out what the most important referrer domains are. For more advanced statistics, companies can implement their own logging system. We at RP Photonics have done that not only to analyze data in more subtle ways, but also to do that such that European privacy law is strictly respected. Well, others often use tools like Google Analytics, which can give you nicely processed comprehensive statistical data – but do you really want a company like Google to monitor every page view of your users? I certainly don't.
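As a sketch of such a referrer analysis, the following Python snippet counts referring domains in raw web server log lines. It assumes the common “combined” log format, and the regular expression is a simplification of real log parsing:

```python
import re
from collections import Counter
from urllib.parse import urlsplit

# Combined log format: request, status, size, then quoted referrer and user agent.
LOG_LINE = re.compile(r'"(?P<request>[^"]*)" \d{3} \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

def referrer_domains(log_lines, own_domain=None):
    """Counts referring domains in web server log lines (combined format).
    Direct visits ('-') and, optionally, internal referrals from
    own_domain are skipped."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if not m:
            continue  # line not in the expected format
        ref = m.group("referrer")
        if ref in ("", "-"):
            continue  # direct visit, or no referrer sent by the browser
        domain = urlsplit(ref).netloc.lower().removeprefix("www.")
        if own_domain and domain == own_domain:
            continue  # internal navigation, not an external referral
        counts[domain] += 1
    return counts
```

Passing your own domain as `own_domain` separates external referrals from internal navigation, so that the resulting counts reflect where visitors actually come from.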
The questions mentioned above are just the basic ones; one can actually go much further:
- You may check the performance of certain important pages. As an example, consider a web page with which you hope to motivate people to do certain things, like downloading a document, moving on to another web page, or perhaps even purchasing a product by filling out a form. With a proper analysis of that traffic, you may find out what percentage of users perform the desired actions (conversion rate) and what the drop-off rate is, i.e., the fraction of people just going away.
- Once you have acquired such data, you may identify pages which do not work well, think about what might be the problem, and try to improve them. If you are not certain, you may employ A/B testing: measure the success of two different versions of a page to find out which one works better.
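For readers who want to see the statistics behind such A/B testing, here is a sketch of the usual two-sided two-proportion z-test in Python. The function names are made up, and real experiments require careful design well beyond this formula:

```python
from math import sqrt, erf

def conversion_rate(conversions, visitors):
    """Fraction of visitors performing the desired action."""
    return conversions / visitors

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: p-value for the null hypothesis
    that page versions A and B have the same true conversion rate.
    conv_*: number of conversions; n_*: number of visitors."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (p_a - p_b) / se
    # two-sided p-value via the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2)))/2
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
```

A small p-value (e.g. below 0.05) indicates that the observed difference between the two versions is unlikely to be a mere statistical fluctuation; with few visitors, even large apparent differences often fail that test.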
Some people are really thrilled by such approaches, hoping that one can employ a kind of scientific method to systematically improve one's pages. In some cases, that approach may truly work well, but there are also serious pitfalls:
- Measuring traffic flow might be easy, but measuring success is not exactly the same. It may work well if there is a clear indicator for success, such as a purchase done through a web form. However, lasers are not usually sold that way. In most cases, people will pick up the information, go away, possibly discuss with colleagues, then perhaps come back on some other day and place an order through a different channel – and it will be very hard to track down what influence a certain detail on a web page had.
- Traffic measurements are prone to various kinds of errors, for example statistical fluctuations (if the considered pages are not very popular).
- You may identify certain problems, but you still have to understand their precise nature and find practical solutions.
- You may be dealing with precise data, but normally you also depend on certain interpretations of them, and at that point serious errors can easily spoil your results. For example, it is stupid to generally assume (as many seem to do) that a page with a high “drop-off” rate obviously has a problem. It might also be that the page serves its purpose perfectly, and people will go away happily – possibly soon coming back for a purchase. Only for certain types of pages is a high drop-off rate really an indicator of a need to improve.
For such reasons, I am generally skeptical about “scientific” approaches to website optimization, particularly in regard to our own website. Being aware of many sources of errors, I would hesitate to invest substantial time into such matters and rather spend that time on careful thinking on what people need, what may work well etc. – essentially, a heuristic approach.
A frustrating conclusion of this article may be that things are unfortunately not that easy:
- Making a good and effective website is a big undertaking, where many mistakes can be made.
- Although there are plenty of tools promising to improve your search engine ranking, few of them really work – and those that do require substantial work.
- You can acquire comprehensive data which you can use for further improving your website – but it is not easy to properly interpret the results and draw the right conclusions.
Nevertheless, having a good and effective website is vital for many companies. It is also the required basis for many other methods to work. For example, it is pointless to get many visitors directed to your website with external measures and disappoint them there. Therefore, I strongly recommend carefully optimizing your website in various ways, particularly concerning the content. Although that causes a lot of work, which may not bring quick results, it is at least something which can generate benefits for years.
My perhaps most important message is that the usefulness of the provided content is probably the most important aspect of all. That is how the RP Photonics website became enormously successful, and you can try the same, perhaps in a narrower subject area, perhaps covering somewhat more than only the direct area of your products.