Why Are We Giving Away Our Data So Easily?

I’ve asked myself this question before, but it feels especially urgent with everything that’s been going on in the world of Facebook and Cambridge Analytica.

Personal Opinion Disclaimer

It’s shocking, really, how quick we are to give out our personal information.

Now, with the way the online world works, I feel that people should understand that almost every company we interact with has been openly given some type of data about us. Arguably the most important part of this is that WE CHOOSE to give them our information; in most cases, we weren’t tricked into handing it over for free.

We’ve Been Conditioned Into Using Our Data

Over the past 8–10 years, companies have conditioned us to use our personal information as gate keys to what they offer. Giving our data away didn’t come easily; it’s been a bit of a struggle for companies to get us to offer them our personal information in exchange for a “free service.” One of the first major pieces of software that comes to mind for me is email.

In many ways, I think we can agree that some of our information must be given in order to provide a personalized experience, but there is a limit. To what degree is too much data? After all, over the years we have become so used to giving our data away that the line of what SHOULD be given has blurred.

Going back to when email first came out, we were excited to use this new software because it allowed us to move away from all the faxing and the phone calls. More and more of us created email accounts so we could connect with friends, family, and business. Down the road, this created the consumer habit that, in order to get access to anything, we must provide our personal information.

It’s very interesting to see how something now considered as simple as email gave many companies the chance to thrive on our preconditioned habits, which in turn gave rise to today’s data behemoths like Facebook, Google, LinkedIn, and many more.

Why We Shouldn’t Give Our Data Away For Free

Every one of us, whether we agree with it or not, is giving our data away. Using data as a gateway to the many services we love could be great.

As with any technology, there will always be risks. What’s bad about giving our data away is:

1. It Makes These Companies BIG Targets For Cyber Attacks

Every website you can imagine has more than likely been attacked by hackers or malicious bots that actively look for loopholes in its systems. One of the biggest reasons is that, every day, hundreds of BILLIONS of personal data transactions are logged across many sites.

This leaves both us and the companies we use open to breaches, and this alone makes offering our data a massive risk.

After all, do you want to be one of the millions of people every year who have their personal data stolen?

Example: the lawsuit against Facebook over the Cambridge Analytica scandal.

2. You Lose Control Of Who REALLY Has Your Personal Info

Regardless of whether we like it or not, when we choose to give a company our data, we immediately lose control of what they REALLY do with it. Granted, many companies provide controls to choose what data they actively pull, but you can’t ever completely remove the data or prevent them from tracking everything.

Because of this, it is very important for us to really think about which companies we hand our personal information to so willy-nilly.

What’s Been Great About Using Our Data To Access Products/Services

On the plus side, using our data to access free products and services gives us tons of amazing solutions to our problems. In many ways, giving out data creates positive situations where it drastically increases the opportunity for people and companies to connect with each other.

Some great examples that come to mind are:

1. Brings Attention To World Problems

Every country has issues; every day, people are dying because of drugs, violence, and even more nuanced problems that people are simply unaware of. Access to platforms like Facebook & Twitter gives billions of people the opportunity to raise large issues and their underlying causes, and to find ways others have solved them or are working through them. Without these services, it’s very likely that we would not have been able to spread the word about huge movements such as #MeToo and #BlackLivesMatter.

2. Allows Free Speech To Spread Where It Matters Most

Many people across the world are repressed and unable to express themselves as they should. Platforms that offer their services in exchange for data give these people a place in the world to connect and to keep others connected.


Over the past few years, we have all been affected in some way by giving our data to companies. Just a few examples to recap: Target’s hack of credit card info, Twitter accounts being hacked and personal data lost, and, just recently, Facebook’s lawsuit over the theft and sale of millions of people’s personal data to a third-party company.

So this should, at the very least, raise these questions…

  1. What company can I really trust with my data?
  2. Should we pay for the services we use that are currently “free”?
  3. Should these large companies restrict the amount of data they offer to third parties?
  4. What regulations should be in place so my data is better protected?
  5. What can actually be done to protect myself and my family from having our data stolen?

So what do you think about our data? Should it be kept more private, and what do you think we should do to better protect ourselves online?

Let us know what you think, we would love to get your thoughts below.

The Old Ways Of Doing SEO Are Dead, Here’s Why


Traditional SEO has a blind spot. For most companies, it’s handled by people more focused on developing your website. Why was SEO put in the IT department in the first place? Let’s dive into a little internet history. Up until July of 2002, the leading search engine was AltaVista, which served up pages based on keywords. SEO was largely about optimizing each page’s meta tags and titles with those keywords. So, it took HTML skills to do SEO.

Even after July 2002, when Google emerged as the leading search engine, most SEOs remained focused on optimizing web pages, because the ten blue links that appeared in Google’s search engine results still featured pages that included relevant keywords. Other factors, like link analysis, required building links from directories, widgets, or footers of various sites. In short, it remained technical enough to need IT.

Universal Search

Then, in May 2007, Google unveiled “universal search,” which blended results from more than just the web to provide the most relevant and useful results possible. In addition to web pages, for instance, the search results started to include video, news, images, and maps. But, in most organizations, the SEOs in IT departments weren’t responsible for creating video, news, pictures, or maps. And the people in marketing departments who were building most of this content didn’t think that SEO was their job. So, despite the fact that the era of 10 blue links had ended, a blind spot developed in SEO that still exists in most organizations to this day.

This is a missed opportunity because YouTube is the world’s second-largest search engine, and it uses metadata (your video’s title, tags, and description) to index your video correctly. To maximize your presence in YouTube search and suggested videos, as well as Google universal search results, video marketers should make sure that their metadata is well optimized. This includes the title, description, and tags of each YouTube video. This isn’t the IT department’s job, and many video marketers haven’t learned how to do it.
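As a rough illustration, here is a minimal Python sketch of the kind of pre-upload sanity check a video marketer could run on their metadata. The length caps used below (a 100-character title, a 5,000-character description, 500 characters of combined tags) are YouTube’s commonly documented limits, but the function and its checks are my own illustration, not part of any YouTube tool, so verify the caps against the current YouTube documentation before relying on them:

```python
# Illustrative sketch: sanity-check YouTube video metadata before upload.
# The length caps are assumptions based on YouTube's documented limits
# (title: 100 chars, description: 5,000 chars, tags: 500 chars combined).

def check_video_metadata(title, description, tags):
    """Return a list of problems found in the metadata; empty means OK."""
    problems = []
    if not title:
        problems.append("missing title")
    elif len(title) > 100:
        problems.append("title exceeds 100 characters")
    if len(description) > 5000:
        problems.append("description exceeds 5,000 characters")
    # YouTube counts the combined length of all tags, not each tag alone.
    if sum(len(t) for t in tags) > 500:
        problems.append("tags exceed 500 combined characters")
    if not tags:
        problems.append("no tags set")
    return problems

issues = check_video_metadata(
    title="How Universal Search Changed SEO",
    description="A quick history of Google universal search.",
    tags=["seo", "universal search", "video marketing"],
)
print(issues)  # an empty list means the metadata passes these checks
```

A check like this costs nothing to run and catches the most common mistake of all: shipping a video with no tags and a truncated title.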

Panda Update

In February 2011, Google introduced its first Panda Update to stop sites with poor-quality content from working their way into Google’s top search results. In May 2011, the official Webmaster Central Blog provided some “guidance on building high-quality sites” that SEOs in IT departments – who don’t write articles – weren’t in a position to implement, while the people in marketing departments – with rare exceptions – didn’t think this guidance was for them. If you want to step into Google’s mindset, the questions below provide some guidance on how the search engine’s engineers were looking at the “quality” of an article:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow?
  • Does this article have spelling, stylistic, or factual errors?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the article describe both sides of a story?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?

These look like questions that someone who majored in journalism or communications can answer, not someone who majored in computer science. So, executives in most organizations need to ask themselves: If SEO now involves creating original, complete, comprehensive, insightful, and entertaining articles, then does it still belong in the IT department, or does SEO now belong in the marketing department?

Penguin Update

In April 2012, Google launched the first Penguin Update to do a better job of catching sites that boost their rankings through link schemes. Google also provided some examples of link schemes that can negatively impact a site’s ranking in search results:

  • Buying or selling links that pass PageRank. This includes exchanging money for links or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link.
  • Excessive link exchanges (“Link to me and I’ll link to you”) or partner pages exclusively for the sake of cross-linking.
  • Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links.
  • Using automated programs or services to create links to your site.

In other words, many of the tactics that SEOs in IT departments have used for years may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. After the Penguin Update, what are the best practices for “building” links? According to Google, “The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.”

In other words, link “building” once required lots of left-brained skills and competencies, but many of those tactics are now considered link schemes. Since 2012, getting other sites to create high-quality, relevant links to yours has required lots of right-brained skills and competencies to create unique, relevant, useful, and valuable content. So, once again, executives in most organizations need to ask themselves: Does SEO still belong in the IT department, or does it now belong in the marketing department?

RankBrain Algorithm

In October 2015, Google confirmed that it had launched the RankBrain algorithm, a machine learning artificial intelligence system. It is mainly used as a way to interpret the searches that people submit to find pages that might not have the exact words that were searched for. Google has said that RankBrain is the third most important factor in its ranking algorithm along with content and links.

According to an article by Danny Sullivan in Search Engine Land, here’s an example of a search where RankBrain is supposedly helping: “What’s the title of the consumer at the highest level of a food chain?”  To a layperson, “consumer” sounds like a reference to someone who buys something. However, it’s also a scientific term for something that consumes food. There are also levels of consumers in a food chain.

So, the name of the consumer at the highest level is “predator.”

So, this changes the way SEOs conduct keyword research. Instead of optimizing for popular short-tail terms that are one or two words long, SEOs need to optimize for more targeted long-tail terms that are three, four, five, or more words long. That’s because longer terms provide more context, which helps you understand consumer intent: what consumers are actually looking for.
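To make that concrete, here is a hypothetical sketch of splitting a keyword list into short-tail and long-tail terms. The sample keywords and the three-word cutoff are illustrative assumptions, not output from any Google tool:

```python
# Split a keyword list into short-tail (1-2 words) and long-tail (3+ words)
# terms. The cutoff and the sample keywords are illustrative only.

def split_keywords(keywords, long_tail_min_words=3):
    short_tail, long_tail = [], []
    for phrase in keywords:
        if len(phrase.split()) >= long_tail_min_words:
            long_tail.append(phrase)
        else:
            short_tail.append(phrase)
    return short_tail, long_tail

keywords = [
    "seo",
    "page speed",
    "how to improve mobile page speed",
    "best seo tools for small business",
]
short, long_terms = split_keywords(keywords)
print(short)       # ['seo', 'page speed']
print(long_terms)  # ['how to improve mobile page speed', 'best seo tools for small business']
```

Run against a real keyword export, the long-tail bucket is where consumer intent lives; "how to improve mobile page speed" tells you far more about what the searcher wants than "seo" ever could.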

Speedy Update

Finally, Google has announced that it will start using page speed in mobile search ranking. Although speed has been used in ranking for some time, that signal was focused on desktop searches. In January 2018, the official Google Webmaster Central Blog announced that “starting in July 2018, page speed will be a ranking factor for mobile searches.” The “Speed Update,” as Google is calling it, will have a significant impact because more than 50 percent of search queries globally now come from mobile devices.

Among the resources that SEOs can use to evaluate a page’s performance is PageSpeed Insights, a tool that indicates how well a page performs on the Chrome UX Report and suggests performance optimizations. Typical optimization suggestions include:

  • Eliminate render-blocking JavaScript and CSS in above-the-fold content.
  • Leverage browser caching.
  • Enable compression.
  • Optimize images.
  • Reduce server response time.
  • Avoid landing page redirects.
  • Minify CSS.
  • Minify HTML.
  • Minify JavaScript.
  • Prioritize visible content.
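To give a feel for what “Minify HTML” means in practice, here is a deliberately naive Python sketch that strips comments and collapses whitespace. Production minifiers are far more careful (for example, around whitespace-sensitive tags like pre and textarea, and inline scripts), so treat this as an illustration of the idea only:

```python
import re

# Naive HTML "minification": drop comments and collapse runs of whitespace.
# This is an illustration only; real minifiers preserve whitespace inside
# tags like <pre> and handle inline <script> content safely.

def minify_html(html):
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # strip comments
    html = re.sub(r">\s+<", "><", html)   # drop whitespace between tags
    html = re.sub(r"\s{2,}", " ", html)   # collapse remaining runs
    return html.strip()

page = """
<html>
  <!-- header starts here -->
  <body>
    <h1>Speed   matters</h1>
  </body>
</html>
"""
print(minify_html(page))  # prints "<html><body><h1>Speed matters</h1></body></html>"
```

Every byte removed this way is a byte the mobile visitor doesn’t have to download, which is exactly the kind of saving the Speed Update rewards.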

Now, these require the kind of technical skills that are found in the IT departments of most organizations. So, even if responsibility for SEO is moved into the marketing department, the SEO team still needs a dotted line relationship with the IT department. Or, you can put people from both departments together to create an emergency task force to improve your mobile friendliness and page speed before July 2018.

SEO best practices in 2018

In summary, SEO best practices in 2018 are very different than they were back in 2002, 2007, 2011, 2012, or 2015. They include:

  • Optimize your YouTube videos, press releases, and images for Google Universal Search.
  • Create a satisfying amount of high-quality content for Google’s Panda Update.
  • Get other sites to create high-quality, relevant links to yours for Google’s Penguin Update.
  • Use keyword research to find relevant long-tail search terms for Google’s RankBrain algorithm.
  • Use PageSpeed Insights to improve the performance of a page for Google’s Speed Update.

In most organizations, the executives still haven’t received the memo that the era of 10 blue links ended a decade ago. This explains why so many organizations still aren’t using these SEO best practices. And whether executives decide to move their SEO team out of the IT department and into the marketing department or not, they still need to recognize that SEO isn’t something that you can set and forget.