The concept of unique content is inextricably linked to the issue of duplicate content. Duplicate content, as even the most inexperienced of internet marketers will know, is a big no-no – even if the exact rules (and what qualifies as duplicate content) are a little blurred. Everyone knows that duplicate content is to be avoided if at all possible, so the phrase ‘unique content’ gets bandied about as the savior.
Yet what is duplicate content? It very much depends on your definition. For some, perhaps less experienced, internet marketers, unique content is simply a catch-all term for any content that won’t receive a flag in a program such as CopyScape. That can mean the substance of the article (such as “how to build a doll house”) is exactly the same, only with the words rearranged to appear in a different order – and thus pass CopyScape.
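To make the rearranging trick concrete, here is a small sketch (not CopyScape’s actual algorithm – the function names and sample sentences are purely illustrative) of why a naive word-frequency check can be fooled by reordering, while a check on word *pairs* cannot:

```python
# A hypothetical sketch: a bag-of-words comparison ignores word order,
# so a "rearranged" article looks identical to it, while an n-gram
# (here, bigram) comparison still sees the difference.

def bag_of_words(text):
    """Word-frequency view of a text: order is discarded."""
    return sorted(text.lower().split())

def bigrams(text):
    """Ordered word pairs: rearranging the words changes the set."""
    words = text.lower().split()
    return set(zip(words, words[1:]))

original = "build the walls first then attach the roof of the doll house"
shuffled = "attach the roof of the doll house first then build the walls"

# Identical word counts: a pure frequency check sees no duplication...
same_bag = bag_of_words(original) == bag_of_words(shuffled)

# ...but the ordered bigrams only partially overlap.
overlap = (len(bigrams(original) & bigrams(shuffled))
           / len(bigrams(original) | bigrams(shuffled)))
```

The point is that “passing the checker” and “being unique” are two different things: the rearranged text fools the weaker measure while remaining, to any order-aware comparison, obviously the same article.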
Is that really unique content though? Probably not. Unique content, in the truest meaning of the word, is something that is truly unique. Something that has been written from scratch, by a human being, and which imparts previously undiscovered knowledge or a fresh opinion. That is the true meaning of unique content, and it’s websites that feature this kind of information that rank well in search engines.
Of course, if you’re working within a much-written-about niche, it’s hard to keep finding groundbreaking information to refer to. So perhaps the truth of unique content is somewhere in between the two extremes: content that is freshly written for the specific website by a human being, that may cover old ground, but at least does it in a useful and well-written way.
One of the major mistakes made by newcomers to the internet marketing world is to assume ‘content’ means nothing more than ‘keyword-stuffed text’. Yes, we all know and appreciate the importance of keywords, and we all want to get them onto our websites in the cheapest and quickest way possible. Yet something that is continually overlooked is the importance of quality content, rather than just a load of text with a few keywords thrown in.
Quality content is best defined as content that gives something to the reader. That might be news, it might be information, it might be help guides; basically, anything that leaves the reader more informed than they were prior to reading it. And quality content, as the saying goes, is King.
The reason is simple. In the job title ‘internet marketer’, the key word is “marketer” – and it’s a darn sight easier to market a website that provides useful, quality information. With quality content, you can hope that other people will enjoy it, and thus take some of the marketing out of your hands. A visitor may find the text of your site informative, and link to an article on a forum you’ve never heard of. Suddenly, there’s a backlink you would never normally have had, and all without you lifting a finger.
It may seem like keywords are all that matter if you want to be a successful internet marketer, but they’re not. Like most things, IM success is a balancing act. Yes, you need keywords, but those keywords need to be inserted into high quality content for your IM career to really take off.
Search engine optimization is one of those odd, new-millennium skills that arrived right at the time when the average user can become an expert. Yes, you need to study SEO, but by and large the mechanics of it can be self-taught using internet forums and help guides. Yet there are companies that offer to perform SEO work for other people, and they manage to stay in business – how do they do that when it’s a skill that most people can learn? Maybe it isn’t all so easy after all…
People who are new to SEO may quickly consider themselves experts. Search Twitter, and you will find a thousand profiles cheerfully insisting that the person running the account is an SEO expert. ‘SEO expert’ is not a legally protected term (unlike doctor, or dietitian), so anyone can claim to be one. And many users may genuinely feel they are experts, with nothing to gain from hiring an SEO company. That’s the problem with new technologies.
If you are looking to launch a website to sell a product or service, you’re probably looking around the internet to see what you need to know. Eventually, the term SEO – and its importance – will crop up, along with the help guides telling you how to do it yourself. Doing it yourself is a lot cheaper than outsourcing to a company, so why should you bother?
Put simply, no one can be an expert in a short period of time. The people who run SEO service companies really are experts, who have studied the art of SEO for long periods of time. You will always, unless you can spare several months to learn it all yourself, get better results with them.
They appear every so often on internet marketing forums: people claiming to have discovered a foolproof “black hat” search engine optimization technique. Their technique, available for a price, will propel your website to the top of the search engine listings – and of course they guarantee you’ll never get caught.
Now, think about it. While we’d all like to believe that there are methods that can get us to number one in Google with no effort whatsoever, it just isn’t true. Google is huge, and it’s smart. There’s no denying that those employing “black hat” techniques (a phrase used to describe methods that go against the terms of service of Google or other search engines) may experience success at first, but it won’t be long term. Not ever. In fact, they’ll be lucky if it works for a few days.
Let’s say these people, these forum peddlers, really had discovered a flawless technique to guarantee themselves top of the pile picks in search engine results. Do you think they’d be selling their method for a couple of bucks on forums? No, of course not. If their method really worked, they’d be creating small affiliate websites in every profitable niche, working their SEO black hat magic and sitting back to watch the profits roll in. Furthermore, the more they publicize their method, the more likely it is that Google will discover it – so why would they risk it?
They wouldn’t, because these methods don’t exist. Avoid them. Don’t waste money – whether on purchasing the method, or on building it into a website – on something that is doomed to fail.
When it comes to search engine optimization, one of the most useful tools in a web developer’s arsenal is the <title> tag within HTML code. Unlike articles, which must be built around keywords (never an easy procedure), the <title> tag is a section of code you can pack with keywords – without having to worry about context, readability, and all the other things an article needs. The extra bonus is that your main page can have a <title> tag full of keywords, and keywords are often hard to work into a simple “welcome to this website” page.
The usefulness of the <title> tag is also one of its major problems. The tag is so powerful, so influential, and so easy to use that those employing shady black hat SEO techniques quickly learned how to manipulate it. They discovered that by using more than one set of <title> and </title> tags in the HTML code of a web page, they could fit in many more keywords – and thus rise up the search rankings. Using many sets of <title> tags is, understandably, known as title stacking.
Get caught doing it by search engines, and you’ll be dropped from the search results quicker than you can say ‘jack rabbit’. It might work for a while, but the overall quality and reliability of your site will soon be called into question – because you will get caught. Use one set of <title> tags only, and keep the keywords relevant to your site.
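Title stacking is also trivially easy to detect, which is part of why it fails. A short sketch, using only Python’s standard-library HTML parser (the function names and sample pages here are illustrative, not any search engine’s real code), shows the kind of check a crawler could run:

```python
# Count the <title> tags in a page's HTML. A well-formed page has
# exactly one; more than one suggests "title stacking".
from html.parser import HTMLParser

class TitleCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.count += 1

def count_titles(html):
    parser = TitleCounter()
    parser.feed(html)
    return parser.count

clean = "<html><head><title>Doll House Plans</title></head><body></body></html>"
stacked = ("<html><head><title>doll house</title><title>dollhouse kits</title>"
           "<title>cheap doll houses</title></head><body></body></html>")
```

Here `count_titles(clean)` returns 1 and `count_titles(stacked)` returns 3 – a dozen lines of code is all it takes to spot the trick, so it’s safe to assume the search engines spot it too.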
This may seem obvious; no search engine is going to rank you well in their search results if their bots discover that there is spyware, malware, viruses or any other kind of internet nasties contained within your website. In fact, if a bot does discover such content, your site will most likely be removed and blacklisted for good.
So that’s simple – and most of you won’t even be considering hosting that kind of content anyway, so there’s nothing to worry about, right? Perhaps wrong. Many sites are subject to hacking, which leads to them being infected with the nasties that search engines (and internet users in general, for that matter) hate so much. Even sites with thoroughly strong security can be hacked and infected, quite without the owner’s knowledge. So you could be merrily promoting your site, working on its content and ensuring your SEO is tip-top, while unaware that your site is infected and only a few steps away from being blacklisted forevermore.
There are a few things you can do to prevent it. The first is obvious, but crucial: visit your site regularly with your anti-virus running, and check that everything seems okay. Secondly, you can get a good idea of what other people think of your site by installing a Firefox add-on called “Web of Trust”. This displays a ring in one of three colors near the browser menu for each website: green means the website is ‘safe’, orange means ‘doubtful’ and red means ‘avoid this site’. These ratings are user generated, so installing the add-on lets you check that no one is experiencing problems with your site.
Have you ever heard the phrase ‘falling in with a bad crowd’? Well, if you link to websites that search engines consider ‘bad’, that’s the search engine optimization equivalent of falling in with a bad crowd. While your website may not be intrinsically ‘bad’ in itself, if you promote (by linking) sites that violate the terms and conditions of major search engines, you’ll be tarred with the same brush. While it’s unlikely your site will be completely blacklisted, you may see a sharp fall in rankings position – or be removed from the search rankings altogether.
This, of course, raises the question: how do I know what a ‘bad’ site is? After all, if someone links to you, you’re probably going to want to do the decent thing and return the favor. That’s what so much of website building, networking and promotion is all about – right? So how can you be sure you’re not destroying your own search engine chances by linking to a poor site that search engines consider bad?
It’s tricky, but the basic answer is to use your gut. How does the website look? Does it look professionally designed and properly maintained? Is the content unique, or does it all sound familiar? Is the English terribly written?
On a more technical basis, you can check the PageRank of the site, and also its standing with Alexa. This should give you a good sense of the general standing of the website in question, and whether or not it’s the kind of crowd you want to be associating with. Also familiarize yourself with the Google terms of service, and scan the site for any obvious violations. If it passes, feel free to post a link back.
Among those well versed in internet marketing, duplicate content is something of a sticky issue. The heart of the problem is what constitutes duplicate content: some internet marketers insist that anything previously published on any other website qualifies, while others say it only matters when the same text is repeated within the same website.
The exact definition is not exactly known, and isn’t helped by the fact that the search engines are not particularly forthcoming on the issue. However, if you are found to be using duplicate content on your website and a search engine does have an issue with it, you can kiss goodbye to a good ranking with that search engine.
It is more likely – though not certain – that the duplicate content rule applies to text used within the same site. You should not, for example, create lots of pages all using the same article with no changes. This is the lesser version of duplicate content, though some marketers still insist that search engines frown on the same article or text being reused anywhere on the internet, and that doing so will trigger a duplicate content penalty.
The idea, of course, is to avoid plagiarism and for search engines to avoid publishing results that show the same text over and over again. To be absolutely sure you’re not committing the duplicate content sin, always write and use original content, both within your website and externally. That way, you can be sure – no matter who is right and wrong in the debate – that you aren’t going to be penalized for it.
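One common way to reason about near-duplicate pages – used here purely as an illustrative sketch, not as any search engine’s disclosed method – is to compare pages by the overlap of their word “shingles” (short runs of consecutive words):

```python
# Compare two texts by the Jaccard similarity of their word shingles
# (overlapping runs of k consecutive words). Identical pages score 1.0;
# unrelated pages score near 0.0. All sample text is made up.

def shingles(text, k=3):
    """All overlapping k-word runs in the text, as a set."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b)

page_a = "our guide explains how to build a doll house from scratch"
page_b = "our guide explains how to build a doll house from scratch"  # a copy
page_c = "contact us today for shipping rates and return policies"

dup_score = jaccard(shingles(page_a), shingles(page_b))
distinct_score = jaccard(shingles(page_a), shingles(page_c))
```

In this toy example `dup_score` comes out at 1.0 and `distinct_score` at 0.0; a site full of pages scoring near 1.0 against each other is exactly the “lots of pages using the same article” situation described above.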
All the major search engines compete to make their search results as relevant, up to date and informative as possible. For a search engine to be considered effective, and therefore gain users, it relies on its reputation for providing the right information for any given search term.
Users are right to expect this. Imagine you were looking for some tips on how to clean your windows, and you used a search engine you’re unfamiliar with. If a result from this new search engine brought you to an adult site instead, you wouldn’t be too happy, would you? In fact, you’d probably dismiss the search engine as useless, and wouldn’t bother to use it again.
That’s why search engines take an issue known as ‘cloaking’ so very seriously. Their livelihoods depend on search results being accurate and informative, so search engines have a duty to their own business – as well as their users – to frown upon cloaking, and they do. Do it, and your website will be removed from search results and most likely blacklisted.
So what is cloaking? Cloaking is the practice of writing code so that human visitors to your website see something very different from what a search engine bot crawling it sees. If you cloak effectively, you could indeed disguise an adult site as something as harmless as window cleaning – and you’d benefit from a good SEO ranking. You’d also, unfortunately, ruin the search engine’s results – and they can’t be having that. When it comes to cloaking, avoid it.
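In server terms, cloaking usually means branching on the visitor’s User-Agent string. The sketch below (hypothetical function names and an illustrative, incomplete list of bot signatures; real detection is far more sophisticated) shows both the trick and the obvious counter-move – fetch the same page as a bot and as a browser, and compare:

```python
# A simplified picture of cloaking: the response depends on who's asking.
# A search engine can catch it by requesting the page twice with
# different User-Agent strings and diffing the results.

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")  # illustrative only

def cloaked_response(user_agent):
    """What a cloaking site might do: show crawlers innocuous text."""
    if any(sig in user_agent.lower() for sig in BOT_SIGNATURES):
        return "Helpful tips on how to clean your windows."
    return "Entirely different content shown to human visitors."

def looks_cloaked(responder):
    """Compare the page served to a bot with the page served to a browser."""
    as_bot = responder("Mozilla/5.0 (compatible; Googlebot/2.1)")
    as_human = responder("Mozilla/5.0 (Windows NT 10.0) Firefox/115.0")
    return as_bot != as_human
```

Since the check is this cheap to run – and search engines can also crawl from undisclosed addresses with ordinary browser User-Agents – a cloaked site is betting against an opponent who holds all the cards.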
Anyone with a basic understanding of search engine optimization will know that text on a website plays a large part in how you are ranked in search engines. In fact, it could be argued that the textual content of a website is actually the most important thing for search engines.
It’s therefore natural for the cunning mind to wonder if it’s possible to introduce sections of ‘hidden text’. Imagine you’re not the best writer in the world, and you don’t want to have to spend a lot of money outsourcing content creation. Yet at the same time, you’re aware of the importance that search engines place on textual content. So rather than writing poor articles yourself, trying to jam your keywords in, you can simply write the keywords into a spare section of your website – and then change the font color so it is the same, or virtually the same, as the background of the page. Suddenly, your website is stuffed with keywords, but all without having to publish poor articles or ruin the look and feel of your website in general.
This practice goes by a variety of names, including hidden text and font matching. However, whatever you call it, it’s a bad deal.
Why? Well, the reason is obvious – it’s a cheat. Google, and the other major search engines, place importance on text content because they want their search results to be relevant. Hidden text defeats the point of this, and if you’re caught doing it, you will have your website banned from the search engine – for good.
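And getting caught is easier than hidden-text fans imagine. Here is a deliberately crude sketch of the simplest possible detector – it only handles inline styles on one tag, whereas real crawlers also evaluate stylesheets, off-screen positioning and tiny fonts; the function name and sample page are made up for illustration:

```python
# Flag the most naive form of hidden text: spans whose inline text color
# matches the page background color. Real detection goes much further.
import re

def hidden_text_spans(html, background="#ffffff"):
    """Return the text of <span> elements styled in the background color."""
    pattern = re.compile(
        r'<span[^>]*color:\s*' + re.escape(background) + r'[^>]*>(.*?)</span>',
        re.IGNORECASE,
    )
    return pattern.findall(html)

page = ('<body style="background:#ffffff">'
        '<p>Welcome to our site.</p>'
        '<span style="color:#ffffff">cheap keywords stuffed here</span>'
        '</body>')
```

Run on the sample page, `hidden_text_spans(page)` pulls out exactly the white-on-white keyword block. If a regular expression can find it, a search engine certainly can.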