Three Advanced Notification Features that Your Site Uptime Monitoring Vendor MUST Deliver
To say that site uptime vendors deliver notifications is about as insightful as saying that cars have steering wheels, planes have wings, or TikTok videos have cringe. It’s a given.
But this doesn’t mean that all vendors use the same notification playbook. Some vendors offer basic (read: superficial) notification features, while others offer advanced notification features that include:
Within minutes of unresponsive site behavior (and after verification that the issue is not a “false positive”), a designated individual in your organization — such as a sysadmin, network specialist, etc. — should be notified of the problem via email, text and/or automated phone call. The use of multiple notification methods increases the chances of a quick response.
If a detected failure continues, you can configure your monitor settings to notify even more members of your team, or other departments. This ensures that long-lasting events get as many eyes on them as they need.
There is more to site uptime than just availability. In other words: a site may be online and accessible, but certain processes within the site — such as those involving checkout, signing-in, customer portals, etc. — may be malfunctioning. You need to be notified of these issues as well, since they can be just as costly and damaging to your reputation as your entire site going down.
The Bottom Line
Site uptime notification is essentially about one thing: discovering issues BEFORE your customers and visitors do, so that you can rapidly target and solve the problem(s). Choosing a vendor that checks all of these boxes is not just a good idea, considering the potential costs and consequences to your revenue and reputation; it is a mandatory move.
AlertBot’s advanced notification feature supports: multiple notification methods (email, text and phone), automatic notification escalation, and comprehensive site issue notification. Discover why leading organizations around the world choose AlertBot. Launch your free trial.
“We’ve been using AlertBot for over eight years now. We were sick of finding out about problems with our website from end users first. While there are varying levels of complexity to AlertBot monitors, even the simple alerts let us know almost instantly when we have an issue. The prioritization of alerting groups and timing allow us to automatically escalate the notifications if someone is not immediately able to respond.” – Chris C., IT Director
Read other verified customer reviews here!
3 Ways Site Uptime Monitoring Boosts SEO
About 25 years ago, if someone told you to “Google” something, you’d probably smile, nod politely, and walk (or perhaps run) away. But now, Googling is the unofficial international pastime. Consider these statistics:
Clearly, the ability to show up for relevant search queries — a.k.a. search engine optimization (SEO) — matters enormously. In fact, it’s beyond enormous at this point. It’s ridiculous. And there’s no slowdown on the horizon. On the contrary, SEO will only play a bigger role in the digital marketing mix going forward, for two simple and satisfying reasons: it’s much more affordable than conventional marketing and advertising, and it works. And you don’t need an MBA or a Bloomberg terminal on your desk to know that affordable + works = popular.
But less clear is the connection between site uptime monitoring and SEO. In fact, at first glance (and second and third as well), there may seem to be no connection at all. However, as any SEO expert worth their Google Search Console will attest, there is a significant link — positive or negative. Below we highlight three ways that site uptime monitoring can boost SEO:
Would-be visitors aren’t the only ones who are frustrated when sites are not accessible — Google takes a dim view of this as well. Now, to avoid triggering paranoia, be assured that Google has said that occasional, short-lived downtime typically won’t negatively impact search rankings. However, ongoing or prolonged downtime is another matter entirely, and will lead to a major downgrade. Site uptime monitoring automatically alerts your SysAdmins, CTOs, and other relevant individuals when a site goes down, so that immediate steps can be taken to get things back online — and make both visitors, and (especially) Google, happy.
Google wants to provide searchers with relevant and quality site recommendations. The first part of that equation is largely determined by elements like keyword optimization, page rank and domain authority. But the second is determined by what visitors actually experience once they arrive on a site. Site uptime monitoring helps you proactively identify broken elements like links and buttons, so that they can be fixed before Google’s web crawler notices them and starts handing out SEO citations.
For a long time, SEO experts pressed Google to confirm that page loading speed was a factor in evaluating sites — and consequently in search engine rankings. And for a long time, Google sat back with its arms crossed and silently smiled (when you make north of $300 billion in revenue a year, you get to do fun stuff like that). However, a couple of years ago Google finally confirmed the worst-kept secret in the SEO kingdom: speed is, indeed, a ranking factor. Site uptime monitoring helps you keep a close eye on page loading times, so that you can ensure that your site blazes like a brand new luxury sedan on the Autobahn, and not like a rusted-out 1984 Reliant K-car that shouldn’t go faster than a bike and can’t really make left turns.
The Bottom Line
Site uptime monitoring is not a magic wand that will transport your site (or sites) to the coveted number one spot for relevant keywords. But as discussed above, it will significantly help your business gain an advantage in the search engine jungle — which means more visibility, more clicks, and more customers.
Start your FREE TRIAL of AlertBot now, and discover why it is the trusted site uptime monitoring solution for some of the world’s biggest organizations. There’s no billing information required, no installation, and you’ll be set up within minutes. Click here.
What is an HTTP 500 Error & How Can You Fix It?
One of the most valuable features of AlertBot’s web monitoring solution is that it automatically and continuously scans web pages for hundreds of possible errors, uniquely identifies them, and even captures a screenshot. Today, we’re going to take a deeper look at one of the many possible errors that AlertBot flags as part of its ongoing scans: HTTP 500 errors.
What is an HTTP 500 Error?
An HTTP 500 error is the server’s catch-all error code. While codes like 400 (bad request), 403 (forbidden), and 404 (not found) describe specific problems, a 500 (internal server error) means something went wrong on the server that doesn’t map to a more specific code. Here’s what an HTTP 500 error looks like over at Google:
Potential Causes & How to Fix Them
Here are some of the common triggers for an HTTP 500 error, along with possible solutions:
The Final Word
Use the above strategies to help you pinpoint and eliminate HTTP 500 errors — ideally as quickly and easily as possible. You can also rely on AlertBot’s comprehensive failure reporting to get notified when and where HTTP 500 errors occur.
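For readers who like to see the mechanics, here is a minimal Python sketch of the detection side: request a page and classify the response, flagging 500s specifically. It is an illustration only (not AlertBot's implementation), and the URL is a placeholder:

```python
# Minimal sketch: request a page and classify the response,
# flagging HTTP 500s specifically. Illustrative only.
import requests

def classify(url: str) -> str:
    try:
        status = requests.get(url, timeout=10).status_code
    except requests.RequestException as exc:
        return f"request failed before any status code: {exc}"
    if status == 500:
        return "HTTP 500: generic server-side failure; check the server logs"
    if status >= 500:
        return f"{status}: a more specific server-side error"
    if status >= 400:
        return f"{status}: client-side error (bad request, not found, etc.)"
    return f"{status}: healthy response"

print(classify("https://example.com/"))  # placeholder URL
```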
Start a free trial of AlertBot today. There is nothing to download or install, no billing information is required, and you will be 100% set up in minutes. Get started now: click here.
AlertBot: How did you get into robotics?
Matthew Vasquez: In the early 2000s, when me and my brother were little kids, my Dad saw [BattleBots] on TV, and at that time, [he] wasn’t even an engineer or anything like that, he was just kind of a hobbyist – good with tools – and he decided he wanted to try it. So, me and my brother were exposed to it from a super early age, and we loved it then and we still love it now.
Jason Vasquez: My family introduced me to robotics as a concept, and brought me to my first event, called RoboGames. And at that event, my first time, I bought a one-pound robot that was a kit, and I learned a lot from it, and [it] obviously gave me the need to keep doing robotics. Through that event, we were able to prep ourselves for BattleBots. So, once BattleBots came back on the air, we were in a good position to apply and get our foot in the door, and it’s been great ever since.
AlertBot: How did you get started in BattleBots?
Matthew: Me and my brother really got started on the TV show BattleBots in 2015 when the show got rebooted on ABC, and then eventually switched over to the Discovery Channel. But around that time, when combat robotics wasn’t really on TV, we were just doing smaller combat robot events in Southern California, sometimes traveling to Northern California, and occasionally other states. When we saw the show was coming back to TV, it was so exciting and [we] wanted to apply. We wanted to get on the show and kind of live up to our childhood heroes. In 2015, we barely made the cut for the TV show and ever since then, we’ve been competing and it’s been a pretty life-changing experience!
AlertBot: Do you plan on staying involved with BattleBots?
Matthew: I think, as long as BattleBots is going, we want to be part of it in some way. We love competing. We love building. I love driving. I love the repair work. I pretty much love all of it! So, I think as long as BattleBots is around, we’re going to try our very best to be a part of it.
Jason: Yeah, whatever that may mean, I’d like to be involved in one way or another. It’s been great being on Whiplash and it’s been great having my own team. It’s a really great community and I’d like to stay involved in one way or another.
AlertBot: How did you come up with the name ‘Whiplash’?
Matthew: To this day, we’re not 100% sure. I was pretty convinced that I came up with it. There was another very unknown smaller robot named “Whiplash;” I really liked the robot, really liked the name, and I just kind of ended up using it for a different event that was not BattleBots, and then it kinda got carried into BattleBots. But we have other team members who are not convinced that it was me who came up with it…
Debbie Vasquez: Yeah, no, it was me. *laughter* It was me. I remember when I came up with it! I remember thinking I really liked “Backlash” back in the day in Comedy Central BattleBots days. And I was thinking “’Backlash.’ Alright, what else can we name it kind of like that?” And I was like, *Gasps* “Whiplash!” But… some people think otherwise.
Matthew: Yeah… *shaking his head* That’s not true. *Debbie laughs* But, whatever, it’s fine. We’ll never know!
Jason: Well, when people first asked us that, we’d like to joke around and say “Because Whiplash wins!” We chose that name [because] it’s a great name, I like it, and it’s been good ever since we chose it!
AlertBot: Is BattleBots a full-time job?
Matthew: Believe it or not, BattleBots is not a full-time job. Pretty much every competitor either works an engineering job, or some other job, or is a student, but BattleBots is not a profession. We go to our jobs for 8 hours a day, come home, work another 8 hours on our BattleBots and rinse, repeat when BattleBots season comes.
Jason: Well, during the two-plus weeks of filming, it is a full-time job, and up until the event with prepping and getting the robot ready, it certainly feels like it. We usually do it on top of school and our actual jobs, too. It’s a lot of time, but we just make the time for it and make it happen.
AlertBot: What do you do in the off season?
Matthew: I have other hobbies: I play tennis, play guitar and bass. My brother does a lot of mountain biking. But in the off season, there are also plenty of other combat robot tournaments going on. There are lots of local ones. Sometimes we travel out of state to go to different ones, but combat robotics is really an all-year-round sport. But it’s that few months a year where BattleBots really takes over our lives.
Jason: I used to be really big into biking, but right now I’m focusing on school and work and, honestly, other types of robotics. I’m really trying to expand my horizons and just continue learning about robotics. It’s great!
Thank you, AlertBot!
Watch the full interview on our YouTube channel below!
A Closer Look at AlertBot’s Email Reports
At AlertBot, we know our customers don’t want to be buried in raw data about their websites and tasks. Instead, they want clear, organized, and reliable intelligence that tells them: what happened recently, what’s happening now, what’s likely to happen in the near future — and what they can do about it. That’s where email reports enter the story.
Here are the five sections in AlertBot’s email reports: Availability, Performance, Common Errors, Failure Events, and Confirmed Failures.
Availability
The Availability section of the email report displays the overall uptime of the websites that you are testing. Additionally, it is color coded.
Performance
The Performance section of the email report provides performance details for the websites that you are testing. It displays a breakdown (measured in seconds) of each process, along with the individual web pages associated with that process.
This is useful both for day-to-day monitoring and for analyzing long-term performance trends, so you can confirm that your websites are functioning properly and meeting their objectives.
Common Errors
The email report’s Common Errors section shows all failures and transition errors that happened within a certain time period. The list includes confirmed events, as well as those that are intermittent. Use this information to check for problems with websites or processes, or to flag issues that need further investigation and analysis.
Failure Events
Here, you will find a list of all confirmed failures (as indicated in the Common Errors section) for each hour in the past week. The failure events are also color coded:
Confirmed Failures
Finally, the confirmed failures section of the email report logs all problem areas. Notably, these have all been confirmed from a secondary location — i.e., they are actual failure events and not false positives.
With this in mind, there can be scenarios where confirmed failure events do not necessarily indicate a problem. For example, you may see that over the past week a website failed 10 times at 1:00am. However, after digging deeper you may discover that this is happening due to maintenance. If so, then you can simply set up a maintenance window.
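To make the maintenance window idea concrete, here is a small Python sketch of the kind of check a monitor can perform before alerting. This is purely illustrative (not AlertBot's internal logic), and the 1:00-2:00am window mirrors the example above:

```python
# Illustrative only: suppress alerts for confirmed failures that
# occur inside a known maintenance window (here, 1:00-2:00am).
from datetime import datetime, time

MAINTENANCE_START = time(1, 0)  # 1:00am
MAINTENANCE_END = time(2, 0)    # 2:00am

def in_maintenance_window(moment: datetime) -> bool:
    return MAINTENANCE_START <= moment.time() < MAINTENANCE_END

def handle_confirmed_failure(moment: datetime) -> None:
    if in_maintenance_window(moment):
        print("Failure during maintenance window: logged, no alert sent.")
    else:
        print("Confirmed failure: alerting the on-call team.")

handle_confirmed_failure(datetime.now())
```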
The Final Word
In the 1990s flick Apollo 13, the big brains at NASA said that “failure is not an option.” Unfortunately, down here on earth, sometimes things in general — and websites and their related processes specifically — don’t work as expected.
Fortunately, that’s where AlertBot’s detailed, yet clear and focused, email reports make a transformative difference. It’s not just raw information. It’s actionable intelligence!
But what sets AlertBot apart is not just the information it provides, but how it presents it. Our reports are clear, concise, and focused, ensuring that you can quickly grasp the key insights without getting lost in a sea of technical jargon. We understand that not everyone is a tech expert, and that’s why we’ve made our reports accessible to all.
So why wait? Take control of your online presence and ensure that your website is running smoothly. Don’t let website issues hold you back – let AlertBot be your trusted companion in the digital realm.
Getting started with AlertBot is a breeze. With our free trial, you can experience the power of our email reports without any commitment. No need to download or install anything, and rest assured, we won’t ask for any billing information. In just a matter of minutes, you’ll be fully set up and ready to uncover the hidden potential of your website: click here.
As you may have already discovered (or will soon encounter), many vendors that offer uptime monitoring solutions charge a setup fee. But instead of seeing this as a legitimate cost, you should view it as a stop sign. Here are three reasons why:
#1: Set up…what exactly?
A site uptime monitoring solution should be fast and simple to set up. A vendor that wants to charge for this is revealing one of two things: 1) their solution is excessively complex; or 2) their solution isn’t excessively complex, but they’re trying to squeeze extra money out of you.
Either reason is unacceptable. If it’s the former, then you can count on plenty of hassles and headaches in the future. If it’s the latter, then ask yourself why you’d want to do business with a vendor that, from day one, is trying to deceive you.
#2: Transparency isn’t optional.
Nothing is wrong with a vendor that wants to raise their prices. It’s a free market. But what IS wrong is when a vendor tries to hide this through a setup fee — which, as noted above, may be (and probably is) bogus to begin with. Basically, a vendor that tacks on a setup fee is trying to manipulate customers. After all, it’s not like customers have a choice: the setup fee is mandatory. So why not just integrate this amount into the overall price?
Transparency with customers should be a principle — not an option.
#3: Setup fees are probably just the beginning.
If you agree to pay a setup fee (whether you accepted it directly or indirectly), expect the vendor to keep asking for additional money. Of course, they won’t hit you with more setup fees down the road, because they don’t want you to wake up and realize that they have normalized something that isn’t normal. Instead, they will likely add other mystery costs, such as “upgrade fees.”
These upgrade fees may seem harmless at first, but they add up quickly, and they are often followed by “maintenance fees” that conveniently appear after a few months of using the service. It’s a cycle of hidden charges you never signed up for, and it leaves you questioning the true cost of the service.
That’s why it’s crucial to review any contract or agreement carefully before committing to a setup fee. Don’t just skim the fine print: ask questions, seek clarification, and don’t be afraid to negotiate. You have the right to know exactly what you’re paying for and how much it will cost you in the long run.
If a vendor won’t be transparent about their pricing structure, or brushes off your concerns, treat it as a red flag. Trustworthy companies understand that strong customer relationships start with being upfront about all costs. So the next time you come across a service that requires a setup fee, think twice. Don’t settle for hidden fees and surprise charges: demand transparency, and hold companies accountable for their pricing.
The Bottom Line
When evaluating site uptime monitoring vendors, naturally you will focus on things like features, functions, technology, and integrations. But you should also scan for setup fees (which may be called something else like “deployment fees” or “implementation fees” — they all mean the same thing).
Regardless of what a vendor might tell you: setup fees are not an “industry standard.” And we’re the proof! AlertBot is a top site uptime monitoring solution provider, and we have never charged any setup fees. Our solution is remarkably easy to set up and configure, and our pricing is 100% transparent with absolutely no hidden costs.
Launch a free trial of AlertBot’s acclaimed site uptime monitoring solution. No credit card. Nothing to download or install. Get started in minutes. And if you decide to purchase our solution, rest assured there are NO setup fees!
A Closer Look at AlertBot’s Alert Group Feature
If we start by sharing that AlertBot’s alert group feature lets you, well, alert certain groups, then you might wonder what earth-shattering revelations we have in store — such as water is wet, fire is hot, and the pain of Game of Thrones’ final season will never, ever go away (seriously, whatever happened to Gendry?!).
Yes, you’re right: the alert group feature IS about alerting groups of people about a site failure — but as George R.R. Martin would say: there is much more to the story! Here’s a rundown of some interesting details that you may not be aware of:
Notes
When you set up an alert group, you can add notes if you feel that it would benefit your team. For example, you can tell your Web Team who the communication point person should be during a failure event (or whether several people share that role), provide updates about vacation schedules, and note anything else that you deem relevant.
Notification Order
You can choose when members of an alert group are notified of a site failure, from immediately all the way up to 48 hours later. For example, your Web Team can be alerted right away during a site failure event, your CTO can be alerted 1 hour into the event if the problem persists, and so on. You can also choose the frequency of alerting and how many times individuals or groups can be alerted during downtime events on your site.
Contact Method
You can also choose which email address will be contacted, based on the notification order. For example, an immediate alert can be sent to [email protected] and other teams/emails if selected, and then an hour later another alert can be sent to [email protected] and so on until the site is back up and running.
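Conceptually, an alert group ties together notes, a notification order, and contact methods. The sketch below models that shape in plain Python. It is a hypothetical structure for illustration (not AlertBot's actual configuration format), and the addresses and delays are invented:

```python
# Hypothetical model of an alert group: who gets contacted, and how
# long into a failure event. Not AlertBot's real schema.
from dataclasses import dataclass, field

@dataclass
class EscalationStep:
    contact: str        # email address to notify
    delay_minutes: int  # minutes into the failure event

@dataclass
class AlertGroup:
    name: str
    notes: str = ""
    steps: list[EscalationStep] = field(default_factory=list)

web_team = AlertGroup(
    name="Web Team",
    notes="Point person during failures: on-call engineer.",
    steps=[
        EscalationStep("webteam@yourdomain.com", delay_minutes=0),  # immediate
        EscalationStep("CTO@yourdomain.com", delay_minutes=60),     # 1 hour in
    ],
)

for step in web_team.steps:
    print(f"{step.delay_minutes:>3} min -> {step.contact}")
```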
Monitors
What happens if you’re doing some testing or updating, and you don’t want failure events across all site monitors to trigger an alert (and maybe spark some anxiety)? No problem: you can choose which specific monitors are associated with an alert group.
But don’t worry: if you have a whole bunch of monitors and want to include them all, then you don’t have to manually add each one to an alert group. Simply select “All monitors in the account” and you’re good to go!
Do You Have 30 Seconds?
We’ve saved the best part for last: setting up a new alert group doesn’t take hours, or even minutes — it takes seconds. Simply choose the options you need, and you’re all set. And changing an alert group’s settings is just as fast (maybe even faster).
Try AlertBot Now
Reading is fun. But experiencing is better (unless you happen to be reading Game of Thrones and are perfectly happy learning about White Walkers vs. hanging out with them). Put AlertBot to the test by launching your free trial today. Play around with alert groups, along with many other features and functions.
There is nothing to download or install, no billing information is required, and you will be 100% set up in minutes. Get started now: click here.
Unleashing the Web Guru: How Website Monitoring Boosts Traffic
by Louis Kingston
In the vast, mystical realm of the internet, where websites come to life and cat videos rule the land, there resides a hidden hero – Website Monitoring. Armed with lightning-fast reflexes and a vigilante’s keen eye, this unsung champion is the secret sauce to soaring traffic.
Picture this: your website is a thriving carnival, with merry-go-rounds of content and rollercoasters of creativity. But, alas, like an absent-minded wizard, you’ve forgotten to keep an eye on the gates. Enter Website Monitoring, the loyal gatekeeper who ensures no trolls sneak in to mess up your virtual fiesta. With a mischievous grin, it sends you real-time alerts the moment any gremlins try to mess with your website’s uptime. Your website’s downtime days are numbered!
Now, let’s journey into the realm of speed. In a world where every second counts, your website’s performance is its very heartbeat. But fret not, dear web adventurers, for Website Monitoring is the swiftest hare in the web-jungle. Armed with its trusty stopwatch, it tracks your page loading times like a hyperactive roadrunner, shouting, “Faster! Faster!” before your visitors can even say, “Are we there yet?” Voilà! Your website now zooms like a caffeine-fueled cheetah on the digital savannah.
Oh, but the fickle web travelers; they change their minds like chameleons change colors. Fear not, for Website Monitoring is here to unravel this enigma. With its mystical analytics, it becomes your crystal ball, revealing the mysteries of visitor preferences and behaviors. You’ll know what they like, what they loathe, and what they yearn for more than a lifetime supply of authentic New York style pizza. Armed with this newfound wisdom, you’ll sprinkle enchanting content like fairy dust, keeping your visitors spellbound and coming back for more.
Behold the battlefield of the mighty search engines, where websites engage in an epic struggle for visibility. But alas, valiant webmasters, Website Monitoring dons its armor of SEO prowess. It crawls through the darkest corners of the interwebs, sniffing out broken links and bad keywords like a digital bloodhound. Armed with this knowledge, you’ll climb the search engine ranks like a warrior scaling Mount Everest – and trust me, you won’t need oxygen!
In this whimsical tale of website wonders, we’ve unveiled the magical powers of Website Monitoring – the tireless protector of uptime, the guardian of speed, the oracle of analytics, and the knight of SEO. So, dear webmasters, heed this advice: with Website Monitoring by your side, you’ll wield the mighty sword of traffic-increase like a modern-day King Arthur.
Embrace the power of Website Monitoring, and may your website’s journey be filled with joy, triumphs, and an army of loyal visitors marching toward your digital domain!
Say goodbye to web nightmares and embrace the hero you deserve: AlertBot! Our supercharged website monitoring service is the ultimate sidekick you need to keep your online kingdom running smoothly. With AlertBot by your side, you’ll enjoy 24/7 vigilance, lightning-fast alerts, and more data than you can shake a unicorn horn at. So, what are you waiting for? Join the epic quest for flawless websites and unleash the power of AlertBot today – because even Gandalf would agree, “You shall not pass…without website monitoring!”
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
7 Deadly e-Commerce Checkout Sins
Back in the 1970s, when bell bottoms roamed the world and 8-tracks reigned supreme, the Eagles warned us that Hotel California was a place where you can “check out any time you like, but you can never leave.”
Well, on the 21st century e-commerce landscape there is a similar dilemma facing customers who want to buy everything from gardening equipment to a new car: they can try to check out anytime they like, but they can never buy.
Below, we highlight seven deadly e-commerce checkout sins that lead to lost sales and reputation damage:
Patience may be a virtue, but most customers aren’t in the mood to refine this noble characteristic when they’re ready to buy stuff. After all, they’ve already invested their valuable time choosing item(s). They want to cross this task off their to-do list right away. In fact, 70% of customers say that page speed impacts their willingness to buy from an online retailer.
People who buy things online are intelligent and savvy. But that doesn’t mean they want to feel as if they’re putting together IKEA furniture when going through the checkout process. They want the experience to be straightforward and simple. They just want to provide the required information — and nothing more. Less is definitely more.
A progress bar tells customers where they are in the checkout process (e.g. cart summary, sign-in, address, shipping, payment), so they know that things are headed towards a satisfying, successful conclusion. Without this information, they can get irritated if they expect the next screen to say “thank you for your purchase”, but it turns out to be yet another form to fill out.
This one is tricky. Waiting for customers to get to the end of a form before telling them that they need to fix one or multiple fields can lead to an “I can’t be bothered with this, I’m getting out of here” reaction.
The best practice here is to configure form validation to scan and report as customers move from one field to another, or possibly one section to another (e.g. shipping address to payment information). Admittedly, some customers will still be irked by these “please fix the error” messages. But sending small notes as they move through the form/section is still better than forcing them to back up after they’ve reached the finish line.
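To make the field-by-field approach concrete, here is a small Python sketch of the server-side piece: a validator the checkout frontend could call each time the customer leaves a field, so errors surface immediately instead of at the finish line. The field names and rules are illustrative assumptions:

```python
# Illustrative per-field validation: the frontend calls this as the
# customer tabs out of each field, so errors surface immediately.
import re

VALIDATORS = {
    "email": lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v)),
    "zip_code": lambda v: bool(re.match(r"\d{5}(-\d{4})?$", v)),
    "card_number": lambda v: v.replace(" ", "").isdigit()
                             and 13 <= len(v.replace(" ", "")) <= 19,
}

def validate_field(name: str, value: str) -> str | None:
    """Return an error message for one field, or None if it passes."""
    check = VALIDATORS.get(name)
    if check is None:
        return None  # no rule for this field; accept it
    return None if check(value) else f"Please fix the {name.replace('_', ' ')} field."

print(validate_field("email", "not-an-email"))  # -> error message
print(validate_field("zip_code", "18101"))      # -> None (valid)
```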
For businesses, granular customer data can be far more valuable than an individual purchase. However, online sellers need to resist the temptation to force all customers to create an account before they can check out. Otherwise, they are going to lose customers; not necessarily because those customers are reluctant to share their data, but because they just aren’t in the mood to pick a username and password, and then validate their email address.
With this in mind, sellers should provide incentives for customers to create an account by, for example, informing them that doing so will enable them to track order fulfillment, save time in the future, etc.
Customers hate discovering surprise costs at checkout. Ideally, sellers can avoid this problem entirely by having zero extra costs of any kind. But realistically, most sellers need to charge shipping/handling (at least until a threshold is met), and potentially other fees based on the item(s) being purchased, the location of the customer, and other factors.
The best way for sellers to deal with this is to make potential/inevitable extra costs explicit. Burying these details at the bottom of a page, and in font so tiny that customers need a telescope to read them, is more than worthy of a pair of Bad Idea Jeans.
Nothing screams “please don’t buy from us” louder than a checkout process where buttons, fields and other elements don’t work, or when customers are presented with a dreaded 404 Page Not Found (ironically, the funnier or more creative this page might be, the more incensed customers can get — as if the seller is shrugging off their pain and suffering). Using a solution like AlertBot to automatically and continuously test page integrity — and proactively send alerts when something goes wrong or doesn’t work — is an absolute must.
The Bottom Line
The e-commerce landscape is fiercely competitive, and it typically takes much less for online customers to head for the virtual exits than it does for in-store customers to head for the physical exits. Online sellers need to ensure that they aren’t committing any of the seven deadly — and wholly preventable — e-commerce sins described above. Otherwise, instead of fostering engaged customers, they will trigger outraged ones.
Why Your Website Monitoring Solution Needs a Do-Not-Disturb Feature
It is so low-tech that Gen Z’ers and other digital natives may faint (or perhaps the avatar in a VR game that they are playing may faint) to learn that one of the greatest inventions in the history of our species is the humble do-not-disturb sign. Indeed, this magical placard is like having your very own private Gandalf shouting: YOU SHALL NOT PASS!
However, the glory of do-not-disturb is not limited to hotels, motels, and teenagers’ bedrooms. It is also a must-have feature in website monitoring solutions.
Why is a Do-Not-Disturb Feature So Important?
It does not take a Jeopardy! champion to know that do-not-disturb means (…wait for it…) “do-not-disturb” — which seems like the very last thing that organizations would want if there are site performance issues. On the contrary, the alarm bells via SMS, email and/or phone call should ring loud and clear. Or…maybe not.
In some cases, it makes perfect sense to pull individuals or teams off the notification list. For example:
What to Look For
A do-not-disturb feature is essential. But this does not mean that all website monitoring solutions that claim to offer this are in the same class. Here is what to look for:
The Bottom Line
Without a versatile do-not-disturb feature, members of your organization will be very disturbed — because at certain times, they will be alerted to website performance issues that they cannot and should not do anything about. This is a waste of time and resources, and can trigger confusion and chaos (and, let’s face it, it’s not great for blood pressure levels, either).
AlertBot’s website monitoring solution has a built-in do-not-disturb feature that checks ALL of the boxes described above. Learn more with a free trial. There is nothing to download or install, no billing information required, and you will be 100% set up in minutes. Get started now: click here.
The year was 1995. Michael Jordan returned to the NBA. Amazon sold its first book. Windows 95 unleashed the era of taskbars, long filenames, and the recycle bin. And when people weren't dancing the Macarena, they were flocking to see Apollo 13 and hear Tom Hanks utter the phrase that would launch millions of (mostly annoying) impersonations: "Houston, we have a problem."
Thankfully, the eggheads in space and the eggheads on the ground worked tirelessly (and apparently smoked a whole lot of cigarettes) to get the crew home. But it was the pivotal moment when the failure was first reported that triggered the spectacular problem-solving process. If it happened an hour (or maybe even a few minutes) later, then the outcome could have been tragic instead of triumphant.
Admittedly, the brave, intrepid professionals in charge of keeping their organization's website online and functional DON'T have to deal with life-and-death scenarios. But they DO need to deal with problems that, if left unsolved, will significantly damage competitive advantage, brand reputation and sales (immediately if we're talking e-commerce, and eventually if we aren't). And that's where AlertBot's failure alerting feature enters the picture.
What is Failure Alerting?
Failure alerting is when designated individuals (such as a SysAdmin, CTO, CIO, CEO, and so on) are proactively notified when something goes wrong with a website, such as downtime, errors, slowness, or unresponsive behavior.
As a result, just like in Apollo 13, the right people can take swift, intelligent action to fix things before visitors/customers sound the alarm bell, or worse, head out the (virtual) door and go straight to a competitor without looking back.
Notification Methods
AlertBot customers can choose any or all of the following methods to notify team members of a website failure event:
For example, a SysAdmin could receive an email, a text message, and a phone call the moment something goes wrong.
Automatic Escalation
Now, if we were in NASA Mission Control circa 1970, someone wearing really thick horn-rimmed glasses would rise above the cigarette smoke and ask: What happens if the SysAdmin doesn't receive the email, text message, and phone call? It's a good question, and there is an even better answer: don't worry about it.
AlertBot's failure reporting feature can be configured to escalate the website failure warning if certain individuals don't respond within a specific timeframe. For example, if a SysAdmin is indisposed for any reason (driving, sleeping, etc.), then after two minutes the alert can be pushed to another designated team member such as the CTO. And if the CTO doesn't respond within two minutes, then the alert can be pushed to the CIO, and so on.
Ideally, the individual (or multiple individuals) who are sent the first alert receive it immediately and take rapid action. But if they don't or can't, then the alert is escalated accordingly. It is important to note that all of this happens automatically, so there is no possibility of human error.
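In simplified terms, the escalation logic behaves like the Python sketch below. This is an illustration of the concept rather than AlertBot's actual code; the two-minute interval and the SysAdmin-to-CTO-to-CIO chain come from the example above:

```python
# Simplified escalation sketch: push the alert up the chain until
# someone acknowledges it. Contacts and timing mirror the example above.
import time

ESCALATION_CHAIN = ["SysAdmin", "CTO", "CIO"]
ACK_TIMEOUT_SECONDS = 120  # two minutes per step (shorten when experimenting)

def acknowledged_by(person: str) -> bool:
    # Stand-in for checking whether the person responded to the
    # email/text/phone alert; always False here for illustration.
    return False

def escalate(incident: str) -> None:
    for person in ESCALATION_CHAIN:
        print(f"Alerting {person} about: {incident}")
        time.sleep(ACK_TIMEOUT_SECONDS)  # wait for a response
        if acknowledged_by(person):
            print(f"{person} acknowledged; escalation stops.")
            return
    print("Nobody acknowledged; alerting the entire chain again.")

escalate("Website down: homepage unresponsive")
```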
Granted, none of this is as entertaining as watching Apollo 13. There's no rousing soundtrack or Tom Hanks. Heck, there's not even Kevin Bacon.
But when it comes to fixing website problems as quickly as possible, organizations know that the less drama, the better. That's precisely what AlertBot's multi-channel, auto-escalating failure reporting feature delivers. We don't need an Oscar. We just need extremely satisfied customers, and we have a lot of those.
Next Up: Reviewing Failure Events Online
In our next blog, we'll explore reviewing failure events online to pinpoint issues and detect problems. Stay tuned!
Launch a free trial of AlertBot's acclaimed site uptime monitoring solution. No credit card. Nothing to download. Get started in minutes. And if you decide to purchase our solution, there are NO setup fees!
Sometimes it just makes too much sense. When the opportunity arose for AlertBot to sponsor one of the highly talented teams in the BattleBots tournament, it just seemed like a no-brainer. (I mean, come on — we’re AlertBOT… it’s a match made in robotic heaven!) In this case, we were able to be among the select sponsors for team Whiplash, a much-celebrated family-run team that regularly competes in BattleBots. As part of the sponsorship, the Whiplash gang invited us to witness the filming of the latest BattleBots season in Las Vegas, Nevada, and it didn’t take much convincing for us to start booking our trip to Sin City.
A pair of us from the AlertBot team flew out to Vegas to meet the Vasquez family – collectively known as Whiplash – on Monday, October 17th, 2022, to get a personal tour of the facilities. We met with Whiplash’s Debbie Vasquez (Team Manager), who graciously showed us around the BattleBots pit area, and was an absolute delight to talk to. She even introduced us to other teams that we could speak with and see their bots prior to the fights. We met with teams that traveled as far as Australia (DeathRoll) to be there for the filming of the show. We enjoyed meeting the entire Whiplash team, which included Matthew Vasquez (Team Captain, Designer, Builder and Driver), Jason Vasquez (Builder, Auxiliary Weapons Operator, Pit Crew), Jeff Vasquez (Builder, Pit Crew), Debbie and others. They were all just like you see them on TV and a pleasure to be around.
2021 marked the first year that a new BattleBots arena building was set up to be a permanent hub for BattleBots tournaments. Next to the main arena building is a small collection of tents dedicated to the various specialty needs of the BattleBots teams. Right alongside the arena is a designated welding area, where Lincoln Electric is set up to assist the teams in working on — or fixing — their respective bots. On the other side of these small tents is the main pit area tent, where every team is set up inside with its own individual workstation. It looked very much like a tradeshow, with tables promoting the teams or selling merch. However, these are quite literally stations where the teams feverishly work on their bots — whether setting them up for their first fight or rebuilding them after a particularly violent encounter. Each team’s work area was also graced with a widescreen TV so they could watch the fights live while working, keeping the builders in the loop as to the progress of the new season. The hope and excitement in that pit area on the eve of the first day of filming was palpable. Sadly, though, every match that produces a winner must also produce a loser.
We were amazed by the goodwill between the teams, too. You might expect there to be a cutthroat competitive nature between them, but instead, there was a shocking amount of love and admiration shared among the teams. By the way they behaved, you would think they were all on the same team together. It was hard to imagine these teams remaining friends after one might totally debilitate or demolish the bot of another. But somehow, they do. Still, it was impossible not to notice the passion, detail, and effort that went into each bot. Each team had immense hope of success with their bots, and you almost couldn’t imagine their hard work resulting in utter heartbreak.
The following day, we arrived early to make it through the front gate check-in area and join the VIPs in finding a place to sit in the audience bleachers inside the arena. Each taping session is 4 hours long, and each day includes 2 of these recording sessions, with a 2-hour break between them. Fans can buy tickets to any of these sessions online (pending ticket availability, of course), so they can attend one session, or both if they desire. We attended both the morning and the afternoon sessions that first day, with a set number of fights occurring in each session and extras squeezed in if possible.
Fans were expected to be very impassioned and involved in each taping session and were often instructed to cheer at specific times. Granted, you don’t have to tell these fans to be excited; they just naturally were. But for taping reasons, there needed to be specific moments of cheering and reactions from the fans to make the event appear smooth for the episodes that would air.
Everyone you’d expect to be in attendance at a BattleBots taping was indeed there. Announcers Chris Rose and Kenny Florian were there to offer their pre- and post-fight commentary, and Faruq Tauheed was there to announce each fight (or, in some cases, re-announce the fight, if he or the producers needed a different take). The judges, who would decide any close-call fights, were also on the other side of the arena cage, and we’d learn of their final verdict when Faruq made his official announcement.
For the audience, comedian Bil Dwyer, who hosted the show during its first iteration in 2000 and 2001, played hype man, and was just a lot of fun. He interacted with us on a personal level, got the younger fans engaged (often rewarding them with free t-shirts and such), and filled in the downtime between fights, which helped some of the slower moments pass by more quickly.
Members of the individual bot teams also would frequently run over to the stands and hand out signs or stickers to fans to enjoy or hold up during their fight to cheer them on. It was a neat little bonus for being there in person.
A given fight would start with Faruq’s announcement, the teams walking out (and posing), and their bots being wheeled into the arena “battlebox” on hydraulic carts. After setup, the countdown would begin, and the bots would go at each other for the win. Each fight was given 3 minutes to play out (easily the most exciting minutes of the day), but some fights didn’t last even half that time. A fight would end early if one bot rendered the other undriveable; other fights would last the full three minutes and then go to the judges to make the final call as to the winner. In most of those cases, the winner would still be chosen “unanimously” across all the judges.
The fights were all pretty exciting. One match ended after about 20 or 30 seconds with a super quick KO, while a couple of others needed the full time to complete. One particular fight ended with a bot catching on fire, and it took some time for the arena to be cleared and readied for the next fight. In the second session, a pair of bots got stuck together after less than 30 seconds of fighting, and after quite some time spent trying to get them apart, they were cut apart and taken out of the arena so the next fight could commence. There was definitely no shortage of memorable moments during a full day of filming!
When we left Vegas for home, we took with us a new appreciation for BattleBots and its talented teams. It’s a sport that appreciates its fans and has a surprising amount of heart on and off camera (especially off camera). We only witnessed a handful of the fights that will be televised next year, but you can be sure we’ll be tuning in to watch these teams go head-to-head for the championship! Fans can tune in on Thursdays at 8pm (check your local listings) to see the new season of BattleBots on The Discovery Channel. Go, Whiplash!
Hey friends! You can follow AlertBot on social media channels at the following links:
X (Twitter): https://www.twitter.com/AlertBot
Facebook: https://www.facebook.com/AlertBot/
Instagram: https://www.instagram.com/alertbot/
Threads: https://threads.net/alertbot
YouTube: https://www.youtube.com/AlertBot/
LinkedIn: https://www.linkedin.com/company/alertbot/
What is Proactive ScriptAssist and Why is it a Game-Changer?
Sometimes — not often, but every now and then — we come across an invention so remarkably useful that we wonder: how did I survive without this?
High speed internet comes to mind. So do GPS devices. And who wants to imagine a world without the cronut?
Well, it’s time to add one more invention to the list: Proactive ScriptAssist.
The Back Story
Websites are not static things. They change over time; sometimes in minor ways, and other times in major ways (for fun, check out the Internet Archive’s Wayback Machine to see what some of your favorite websites looked like in the past — like Apple’s home page from 1996 which invites folks to learn about “the future of the Macintosh”).
Now, for visitors, the fact that websites constantly change is not a problem. In fact, it’s often a good thing because the change is an update, addition, or improvement of some kind.
But for IT and InfoSec professionals who are in charge of (among other things) website monitoring in their company, these changes can — and often do — trigger all kinds of bugs and errors. Fields and forms stop working, elements stop loading (or they load v..e..r..y….s..l..o..w..l..y), and there can be security vulnerabilities as well.
Multi-Step Monitoring
Thankfully, there is a way to verify that everything is working before site visitors start sounding the alarm bells — or worse, disappearing never to return.
This method is to implement an easy-to-use web recorder to create scripts of what site visitors actually (or typically) do on various web pages, and make sure that everything is working properly. This is highly effective. That’s the good news.
The not-so-good news is that when changes occur — even fairly small ones — re-scripting monitors can be a complex process that, in some scenarios, may require a level of expertise and experience that some IT/InfoSec professionals don’t have.
What’s the solution to this obstacle? Let’s all say it together: Proactive ScriptAssist!
About Proactive ScriptAssist
Available EXCLUSIVELY from AlertBot, Proactive ScriptAssist is an optional plan that includes the following:
Plus, if needed, our team offers advanced support over remote desktop sessions (join.me sessions). This is not always necessary, but it is another layer of help, just in case.
The Bottom Line
Inventions that changed our lives: High speed internet. GPS. Cronuts. And now, AlertBot’s Proactive ScriptAssist. It’s an elite list, and one that we’re honored to join.
Learn More
Ready to make your IT/InfoSec teams weep with joy (which is nothing like the weeping they did that time the intern wiped out the backup)?
If you’re a current AlertBot customer, then contact your Account Manager today.
If you haven’t yet experienced AlertBot, then start your free trial today. You’ll be set up in minutes. No billing information, nothing to install, and no hassle.
Now, if you’ll excuse us, we’re going to read about the future of the Macintosh while enjoying a cronut or two (or 5).
Just How Bad is a Down, Slow, or Dysfunctional Website? It’s Worse than You Think!
Have you ever watched a movie (*cough* Godfather III) and said to yourself: “wow, this is so incredibly bad — I don’t think this can get worse!” But then it does. Much, much worse.
Well, having a down, slow, or dysfunctional website is similarly nightmarish — just when you think the reputation devastation is finally over, there’s more on the horizon. With apologies to William Congreve: hell hath no fury like a customer scorned.
Not convinced? Here’s what happens to companies that get on the wrong side of their customers:
Scary stuff, huh? “But wait — there’s more!”
These days, many unhappy customers publish reviews to punish companies that fail to meet their expectations. But guess what? These eviscerating appraisals are not just seen by other potential customers (many of whom quickly decide not to become actual customers). They are also seen by potential job candidates, who are not enthusiastic about joining an organization that is used as target practice by denizens of the interwebs (everyone from THE ALL CAPS BRIGADE to the “tl;dr” force to the League of Extraordinary Grammarians).
However, just as all nightmares eventually come to an end (hey, even Godfather III mercifully rolls credits at the 2-hour-42-minute mark), there is something that companies can do to dial back — or better yet, eliminate — customer outrage caused by a down, slow, or dysfunctional website: get AlertBot.
AlertBot’s fully integrated monitoring platform monitors all your websites, web applications, mobile sites and services — all in one place. Unlike many other products in the marketplace, AlertBot doesn’t merely monitor a URL’s basic availability. It dives much deeper and monitors full page functionality using real web browsers in order to verify every page element, script, and interactive feature. As a result, you can proactively scan for errors, track and optimize load times, pinpoint issues, and get alerted to problems and failures.
The bottom line? A down, slow, or dysfunctional website can be so catastrophic that it makes Godfather III look like, well, Godfather I or Godfather II. Don’t hope for an Oscar just to win a Razzie. Get AlertBot and inspire your target audience to cheer vs. churn.
Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.
It is arguably the most important 3-letter acronym on the digital marketing landscape. No, it’s not ROI. It’s SEO. Consider that:
Clearly, effective SEO is extremely important. And for many businesses — especially smaller companies that are competing against big, established enterprises — it’s a matter of survival. However, for some decision-makers outside of the digital marketing world, the link between SEO and site uptime is less clear. Let’s fix that.
For Search Engines, it’s All About Relevance
Realtors like to point out that the three most important factors in evaluating a property are: location, location, and location. Well, the big brains behind search engines like Google and (to a lesser extent) Bing and Yahoo are obsessed with: relevance, relevance, and relevance.
What this means is that when responding to a search query — anything from “tennis rackets” to “what’s this itchy red bump on my foot?” — search engines strive to produce results that searchers will see as relevant. Otherwise, searchers will eventually switch search engine brands (e.g. leaving Google and using Bing). Relevance is the glue that keeps the relationship sticky. And unlike with those glorious model airplanes that many of us failed to build when we were kids, in this case, the more glue the better.
Downtime Damages Relevance
Since search engines strive to deliver relevant search results (and therefore positive user experience), it makes sense that downtime — which can be defined as a site being inaccessible or outright disappearing — is the enemy.
After all, if a searcher looking to buy a tennis racket clicks a site and discovers that it’s unavailable, then they won’t just punish the company that they hoped to engage: they will, in time, punish the search engine that pointed them in that direction. That fear keeps search engine folks awake at night (including at mighty Google, which commands more than 90% of the desktop and mobile search marketplace), and it explains why downtime is such a threat: it damages relevance.
Is 100% Uptime Absolutely Vital?
This warning about downtime raises an important question: do companies that want to stay far, far away from Google’s, Bing’s and Yahoo’s penalty box have to ensure 100% uptime? Not necessarily. While uninterrupted availability is ideal, it is not realistic. Occasionally, a site will go down for a few seconds or perhaps even longer. There are a variety of reasons for this, such as problems with a web host, an unexpected spike in traffic, and ol’ fashioned human error (hey, we all make mistaks…er, mistakes).
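To put "not necessarily" into numbers: every uptime target implies a concrete downtime budget, and the arithmetic is simple. A quick back-of-the-envelope calculation in Python (ordinary math, nothing vendor-specific):

```python
# Downtime allowed per year at common uptime targets.
HOURS_PER_YEAR = 365.25 * 24  # ~8,766 hours

for uptime_pct in (99.0, 99.9, 99.99):
    downtime_hours = HOURS_PER_YEAR * (1 - uptime_pct / 100)
    print(f"{uptime_pct}% uptime -> about {downtime_hours:.1f} hours "
          f"({downtime_hours * 60:.0f} minutes) of downtime per year")

# 99.0%  -> ~87.7 hours/year
# 99.9%  -> ~8.8 hours/year
# 99.99% -> ~0.9 hours/year (roughly 53 minutes)
```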
However, the top priority for all businesses that want to win the SEO game must be to minimize site downtime in terms of both frequency and duration. They also need to know why site downtime occurs, in order to proactively address issues and keep them from recurring. And that is where site uptime monitoring enters the picture.
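As a bare-bones illustration of what uptime monitoring means mechanically, here is a toy Python checker that tests availability and response time from a single location. Real services (AlertBot included) go much further, checking from multiple locations, verifying failures, and alerting through multiple channels; the URL and the two-second threshold here are assumptions:

```python
# Toy uptime checker: one URL, one location, simple thresholds.
import time
import requests

URL = "https://example.com/"       # placeholder site
SLOW_THRESHOLD_SECONDS = 2.0       # assumed "too slow" cutoff

def check_once() -> None:
    try:
        response = requests.get(URL, timeout=10)
        elapsed = response.elapsed.total_seconds()
    except requests.RequestException as exc:
        print(f"DOWN: {URL} unreachable ({exc})")
        return
    if response.status_code >= 400:
        print(f"ERROR: {URL} returned {response.status_code}")
    elif elapsed > SLOW_THRESHOLD_SECONDS:
        print(f"SLOW: {URL} took {elapsed:.2f}s")
    else:
        print(f"OK: {URL} responded in {elapsed:.2f}s")

while True:
    check_once()
    time.sleep(60)  # re-check every minute
```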
What to Look for in Site Uptime Monitoring
There are many site uptime monitoring products in the marketplace, ranging from superficial (and usually free — hey, we get what we pay for), to robust and reliable. Obviously, organizations need to choose from among the latter and avoid the former. To that end, here is what to look for in a site uptime monitoring solution:
And it goes without saying: a legitimate and reliable site uptime monitoring solution must be backed by a responsive team of experts who will immediately take ownership of an issue and see it through to resolution. This cannot be emphasized enough, because the only thing worse than site downtime is trying to get help from people who don’t know what they’re doing. It gets ugly in a hurry.
SEO is Here to Stay
The rules of SEO will change — this much is certain (Google tinkers with its algorithm hundreds of times a year). But what isn’t going to change for search engines is the supreme importance of delivering relevant results. This means effective site uptime monitoring is not an option. It is essential, and companies that fail to heed this wisdom will soon be expressing another 3-letter acronym: SOS.
AlertBot is a leading site uptime monitoring solution that checks ALL of the boxes above, which is why it’s trusted by some of the world’s biggest brands. Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.
Multi-Step Monitoring: Why it’s Essential and How it Works
The term “essential” is thrown around pretty loosely these days. That new show about the hospital (no, not that one… not that one either… yeah that one) is advertised as essential viewing. A newly-released track by a hip hop artist that describes how little they need to release new tracks in order to live much, much better than the rest of us? That’s essential listening. And how can we forget that new muffin that cannot legally be advertised as a muffin, because it is technically more of a candy? That’s essential snacking (“mmmmmm….pseudo muffin”).
But then on the other end of the hype spectrum, there are things that are legitimately essential, because going without them could lead to dire consequences — or maybe even a catastrophe. And for e-commerce companies, one tool that truly qualifies as essential is multi-step monitoring.
What is Multi-Step Monitoring?
In a break with tradition in the complex world of technology, multi-step monitoring is pretty much what it sounds like: a way to track the various steps that customers take as they move through pages on a website. This way, businesses can proactively identify and fix problems such as buttons that don’t work, forms that won’t submit, links that don’t go anywhere, pages that take too long to load, and so on.
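As a rough sketch of the underlying idea, here is what a scripted multi-step check can look like using the open-source Playwright library. This is purely illustrative and is not how AlertBot implements its monitors; the URL, selectors, and credentials are invented:

```python
# Illustrative multi-step check with Playwright (pip install playwright,
# then run `playwright install`). Selectors and URL are invented examples.
from playwright.sync_api import sync_playwright

def run_checkout_steps() -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()

        page.goto("https://shop.example.com/")       # step 1: load home page
        page.click("text=Sign in")                   # step 2: open sign-in
        page.fill("#email", "test@example.com")      # step 3: fill the form
        page.fill("#password", "correct-horse-battery")
        page.click("button[type=submit]")            # step 4: submit
        page.wait_for_selector("text=My Account")    # step 5: verify success

        browser.close()

if __name__ == "__main__":
    run_checkout_steps()
```

Each step either succeeds or raises an error, so a scheduler running a script like this can report exactly which step broke, not just that "something" failed.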
Why is Multi-Step Monitoring Essential?
Most customers who run into problems don’t shrug them off. They get mad. And that compels them to hit the brakes and head for the exit. In fact, a whopping 88% of online consumers are less likely to return to a site after just one bad experience. So, yeah, preventing about 9 in 10 customers from disappearing is important. One might even say that it’s… wait for it… ESSENTIAL!
How Multi-Step Monitoring Works
In AlertBot, configuring multi-step monitoring is remarkably easy, and doesn’t require an advanced degree in Hypercomplex Supergeekery, with additional specialized certifications in Megaultra Nerdology. Here is how it works (a video tutorial is also available):
And that’s all there is to it. When the test is complete (this can take up to two minutes), a report is automatically generated that shows:
Tests can be run at any time to verify that problems are fixed and improvements are made. It’s remarkably easy. And yes, it’s essential.
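To make the idea concrete, here is a minimal sketch of what a multi-step check does under the hood, written in Python with the requests library. This is an illustration, not AlertBot’s actual implementation, and the site, endpoints, form fields, and credentials are all hypothetical:

```python
# A toy multi-step (transaction) check. The site, endpoints, and form
# fields below are invented for illustration purposes.
import time
import requests

BASE = "https://shop.example.com"  # hypothetical site under test
MAX_SECONDS = 5.0                  # per-step load-time budget

def run_step(session, name, method, url, **kwargs):
    """Run one step of the journey; fail loudly if it breaks or is slow."""
    start = time.monotonic()
    response = session.request(method, url, timeout=MAX_SECONDS, **kwargs)
    elapsed = time.monotonic() - start
    assert response.ok, f"{name}: HTTP {response.status_code}"
    assert elapsed <= MAX_SECONDS, f"{name}: too slow ({elapsed:.1f}s)"
    print(f"OK  {name}: {response.status_code} in {elapsed:.2f}s")

with requests.Session() as s:  # a Session carries cookies between steps
    run_step(s, "home page", "GET", f"{BASE}/")
    run_step(s, "sign in", "POST", f"{BASE}/login",
             data={"user": "monitor@example.com", "password": "secret"})
    run_step(s, "add to cart", "POST", f"{BASE}/cart",
             data={"sku": "TEST-123", "qty": 1})
    run_step(s, "checkout page", "GET", f"{BASE}/checkout")
```

Run on a schedule, a script like this fails the moment any step of the journey breaks or slows down, which is the essence of multi-step monitoring. (A production tool like AlertBot drives real browsers rather than raw HTTP requests, so it also catches rendering and JavaScript problems.)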
Learn More
Discover the benefits of multi-step monitoring. Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.
]]>The 3-Step Communication Game Plan for a Site Outage (One of Our LEAST Favorite Things)
If those von Trapp Family singers from The Sound of Music collectively woke up in a really, really bad mood and decided to write a song about their least favorite things, then it’s a safe bet that not being able to connect to a website would make the list (alongside airline passengers who tilt their seat back, and clam shell plastic packaging).
Indeed, the level of rage that many people experience when their browser presents them with a “cannot connect to that website” message is enough to trigger a blood pressure monitoring app alarm on a smartwatch. It’s the equivalent of going to a store, only to find out that the door is locked. Actually, it may be worse than that, because at least there could be some therapeutic comfort in commiserating with other disappointed customers. But in the virtual world, the journey is usually solo — and so is the misery.
The bad news is that there is no way to absolutely, completely, and permanently prevent site outages from happening. However, the good news is that companies can — actually, scratch that: they must — be proactive in mitigating the pain and suffering, both for their site visitors and for themselves. To that end, here is a three-step communication game plan:
Step 1: Tell the story.
Without delay (not even for lunch), companies should leap into their operational digital properties — e.g. social media, email, SMS, chat, widget, etc. — and clearly describe:
Step 2: Update the status page.
All of the information shared through social media and other channels should be published to a dedicated status page, which — as the name suggests — exists for one purpose only: to highlight and describe the status of a website (or possibly multiple websites that are part of the same brand or portfolio). It is vital to keep the status page updated to reflect the current phase: investigating, fixing, and resolved.
In addition, the status page should invite visitors to subscribe, so that they can receive real-time notifications when things change — and ultimately, when they get back to normal.
Step 3: Conduct a postmortem and share the findings.
Once the outage is history, companies should figure out precisely what went wrong. Using a top-rated site uptime monitoring tool, like AlertBot, can provide helpful clues, and just as valuably, ensure that there isn’t a repeat performance. This information should be shared with the customer community and all other stakeholders, such as suppliers and strategic partners.
Typically, this information is shared through a blog post, which the company’s social media accounts (Facebook, Twitter, Instagram, etc.) point to. Even if the company is not technically at fault (after all, nobody wants to be assailed by a DDoS attack), the fact remains that visitors were inconvenienced. An authentic apology goes a long way to easing frayed nerves and restoring trust.
The Bottom Line
Site outages are dreadful. Yet, they happen, and companies need to have a communication game plan to minimize the frustration for visitors, and the adverse impact on their reputation. The von Trapp Family singers would approve (and probably turn it into a song that you can’t get out of your head, no matter how hard you try).
]]>Why Website UX “Edge Cases” Lead to Visitor Frustration — and What to Do About It
The year was 1993. Beanie Babies invaded the planet. Dinosaurs dominated cinemas worldwide when they escaped from Jurassic Park. Seinfeld won the Emmy for Outstanding Comedy Series (you might say that Jerry & co. were masters of their domain). And righteous rockers Aerosmith extolled the virtues of “living on the edge.”
A lot — and we are talking A LOT — has changed since 1993; especially that advice about living on the edge. Frankly, the last thing that companies want is for their website visitors to go anywhere near the edge, because they may fall off.
Edge Cases
What we are talking about here are “edge cases,” which refer to website UX pitfalls that are unlikely — but nevertheless possible. And when visitors experience one of these edge cases, it is not a matter of whether they will get mad: it is a question of how enraged they will become. Hell hath no fury like visitors thrust into a nasty edge case. Here are some examples:
As a result of these negative experiences, visitors cannot move forward as both they and the company desire — or, to use a term from the UX world, their momentum on “the happy path” is thwarted. Fortunately, that is where synthetic monitoring enters the picture.
The Role of Synthetic Monitoring
Synthetic monitoring (sometimes referred to as journey monitoring) is a method of simulating and evaluating the various journeys that visitors take on a website: where they go, what they do, what buttons they press, what forms they fill out, and so on.
With synthetic monitoring, companies can proactively identify and address edge case scenarios, but without having to rely on excessive manual testing or live user monitoring. This is not only more efficient, but it exposes edge cases that would otherwise go undetected.
Ideally, addressing edge case scenarios means eliminating them entirely — such as fixing bad code. But at the very least, companies can put up signposts that point visitors in the right direction. For example, since there is no way to 100% guarantee that every visitor will correctly input their credit card number, a form can be modified to tell visitors when an input error has occurred.
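As a concrete illustration of that last point: one common way to power an inline “please check your card number” message is a Luhn checksum test, which catches most mistyped card numbers before the form is ever submitted. Here is a minimal sketch in Python (the validation rule itself is standard; where and how it gets wired into a form is up to the site):

```python
def luhn_valid(card_number: str) -> bool:
    """Luhn checksum: catches most mistyped credit card numbers."""
    digits = [int(ch) for ch in card_number if ch.isdigit()]
    if len(digits) < 13:  # too short to be a real card number
        return False
    checksum = 0
    # Double every second digit from the right; subtract 9 if it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

print(luhn_valid("4539 1488 0343 6467"))  # True  (a published test number)
print(luhn_valid("4539 1488 0343 6468"))  # False (one digit mistyped)
```

With a signpost like this in place, a visitor who fat-fingers a digit gets an immediate, polite correction instead of a mysterious failure three screens later.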
AlertBot: Avoiding the Edge
AlertBot supports advanced and easy-to-use synthetic monitoring that helps companies run and evaluate various UX scenarios before their visitors do — and ultimately reduce edge cases. Hey, Aerosmith is welcome to live on the edge (who are we to criticize the group that brought us Guitar Hero?). But companies that want to drive visitor engagement — and prevent frustration — should live as far away from the edge as possible.
Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be setup within minutes.
]]>We all know the pleasure we feel when we dig into an old pair of jeans and pull out a crumpled $5 bill, or when we finally get around to vacuuming our car (“Hey, I don’t remember eating onion rings in here”) and find a few bucks in loose change. It’s as if the universe has taken a moment to smile on us.
Now imagine that, instead of finding enough money to buy some more onion rings (“Oh yeah, I remember when I ate onion rings in here — wow, that was a long time ago”), you get your hands on a cool $1.85 million. Pleasure isn’t the word for that. Euphoria is.
Well, in a sense, that is what owners, investors, and anyone else who has a financial stake in your organization could feel if you choose a superior site uptime monitoring solution. Why? Because new research has revealed that $1.85 million is the average price tag that organizations pay to recover from a ransomware attack — a figure that has more than doubled in the last year. Let’s unpack this by taking a look at ransomware, and then explaining the link to site uptime monitoring.
What is Ransomware?
Essentially, ransomware is a type of malware that infects a computer, and blocks access to it unless victims pay a fee (a.k.a. a ransom). And if that was not nefarious enough, there are two other things about ransomware that need to be part of the story.
The first is that victims are given a very limited amount of time to pay up. If they fail to do so, then the threat — which is often carried out — is they will permanently lose access to their data, or their data will end up being disclosed on the dark web or elsewhere. The second is that even after they pay the ransom in full, only 8% of victims get 100% of their data back, and 29% get no more than 50% of their data back. In the legitimate business world, this kind of chronic non-fulfillment would lead to massive customer loss, and probably lawsuits and investigations. But on the cybercrime landscape, it’s standard operating procedure. There is no complaints department or review site (“We were very disappointed in this hacker who failed to return all of our data, but we are adding a star because communication was prompt”).
Where a Site Uptime Monitoring Solution Enters the Picture
A superior site uptime monitoring solution cannot block ransomware attacks. For strategies and tactics on that front, we recommend this helpful article at eSecurityPlanet.com, and this site by the Cybersecurity and Infrastructure Security Agency (CISA).
However, a superior site uptime monitoring solution CAN do something that hackers earnestly hope that potential victims do not realize: immediately alert them to a ransomware attack — even if it’s at 3:00am — so they can rapidly roll out an uncorrupted backup and carry on without disruption or (and here is the euphoric part) having to pay $1.85 million or more in ransom/recovery costs.
Then, the organization can move to fortify cybersecurity defenses and reduce the size of the attack surface (probably by deploying many of the recommendations highlighted by the sources listed above), ultimately reducing the likelihood of future ransomware attacks.
The Bottom Line
Ransomware is on the rise, with the number of reported incidents surging 183% between the first two quarters of 2021. A superior site uptime monitoring solution won’t stop these attacks or frankly even slow them down. Hackers are notorious for doing things over and over again until they stop working — and unfortunately, ransomware is quite profitable. But it can give organizations the warning and time they need to strengthen their defenses, and in the process potentially save an average of $1.85 million.
Launch a free trial of AlertBot’s superior site uptime monitoring solution. No credit card. Nothing to download. Setup in minutes.
]]>It’s Cyber Week! All new AlertBot signups this week get 20% off for the life of their account! Use promo code 2021CW20 when you sign up to claim this deal! https://www.AlertBot.com
]]>It’s Black Friday all week for AlertBot! All new signups this week get 20% off for the life of their account! Use promo code 2021BF20 when you sign up to claim this deal!
]]>What Exactly is a Website Monitoring “False Alarm” and Why You Should Care About It
by Louis Kingston
You know what falsehoods are. You know what false teeth are. You may even know some falsehoods about false teeth. But do you know what a website monitoring false alarm (also known as a “false positive”) is? If not, then please keep reading to find out — because it’s a very big deal.
What is a False Alarm?
Remember back in grade school, when the fire bell suddenly went off in class and you were instructed to exit the classroom single-file and march outside? As you rose from your desk, heart racing, you wondered if you’d ever see your Trapper Keeper, Real Ghostbusters lunchbox and JanSport backpack again. But after you and your classmates were wrangled into the parking lot to stand in the brisk autumn air for what felt like an eternity, you soon learned it was just some older kid who thought it’d be funny to pull that shiny red lever on the hallway wall.
Well, that’s essentially what a false alarm is: a result that incorrectly indicates that a particular condition or attribute is present (i.e. it wasn’t a real fire consuming your place of education; it was merely a “false alarm” thanks to that jerk in the grade above yours).
What is a Website Monitoring False Alarm?
What you need and expect from a website monitoring tool is to know precisely when your website goes down. Why? Because research has found that the average cost of site downtime is $5,600 per minute. And remember, we are just talking about the average cost here. Some site downtime fiascos are much more costly. Just ask Amazon, which lost an estimated $99 million after going down for 63 minutes during Prime Day in 2018. Granted, most businesses (including yours, unless you happen to be Jeff Bezos) won’t have to shell out roughly $1.57 million a minute due to website downtime, but the basic point should be clear: site downtime is costly, and downtime alarms are supposed to minimize this financial damage.
But what happens when a website monitoring downtime alarm goes off, but nothing is actually wrong? It gets chalked up to a false alarm.
Why Website Monitoring False Alarms Are So Common
Many website monitoring tools — and virtually all of the free kind — have a test server in one location. If that test server detects that a website is not available, it does the only thing it can: sound the alarm. And that seems to be the correct thing to do, right? Well, not exactly.
Let’s say that the website in this example is only down for a few seconds due to an isolated power outage. A single test server has no way of knowing this (i.e. that the website came right back up). And so, it is going to generate a false alarm.
The Solution: Multiple Testing Server Locations
Now, imagine that there are multiple test servers spread out across the country — say, one in New York and one in Los Angeles. The test server in New York detects that a website has gone down, and triggers a red alert (this test server is a big Star Trek fan). But it doesn’t sound the alarm. Instead, 60 seconds later the test server in Los Angeles checks in. If the website is up, then it cancels the red alert. If the website is down, then it confirms the initial diagnosis by the test server in New York, and the alarm goes off.
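Here is a toy sketch of that confirmation logic in Python. The URL is a placeholder, and in a real monitoring network the two checks would run from probes in physically separate locations rather than from one script; the delay below merely stands in for the second opinion:

```python
import time
import requests

def site_is_up(url: str, timeout: float = 10.0) -> bool:
    """One probe's verdict: did the page load successfully?"""
    try:
        return requests.get(url, timeout=timeout).ok
    except requests.RequestException:
        return False

def confirmed_outage(url: str, recheck_delay: int = 60) -> bool:
    """Sound the alarm only if a second, later check agrees."""
    if site_is_up(url):
        return False               # first probe is happy; nothing to report
    time.sleep(recheck_delay)      # wait, then get a second opinion
    return not site_is_up(url)     # alarm only if the outage persists

if confirmed_outage("https://www.example.com"):
    print("RED ALERT: outage confirmed by two independent checks")
else:
    print("All clear (or a transient blip that resolved itself)")
```

That single extra check is what separates “the site is down” from “one probe had a bad minute,” and it is why multi-location monitors produce so few false alarms.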
The AlertBot Advantage
At AlertBot, we hate false alarms even more than our customers do. That’s why, unlike many other website monitoring tools — and again, virtually all of the free ones — we have test servers located across the U.S. and worldwide. We don’t guess whether our customers’ websites are down. We know.
Plus, when it is necessary to send out an alert, our system automatically and immediately contacts key people — such as network administrators, SysAdmins, CIOs, etc. — through email, SMS/text message, or phone (or any combination).
What’s more, our test servers keep checking for website availability, and provide an update (again, in the preferred method) when it goes back up. We also highlight the amount of time that the website — or a specific portion/page of the website — was down. Our customers use this information to keep an eye on overall website performance, proactively detect problems, and ensure that their web host is consistently meeting uptime standards.
Ready to bid false alarms a true farewell? Then start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes. Click here.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>“Frodo, We Aren’t in the Shire Anymore”: The Importance of a Customer Journey & How to Avoid Wrecking It
by Louis Kingston
Fans of Lord of the Rings — otherwise known as “Ringers” — never grow weary of reading or watching Frodo and his fellow Hobbits journey through Middle Earth on an epic quest to Mordor (where rumor has it there now exists a very stylish Starbucks at the base of Mount Doom).
Well, customers who visit a website are on an important journey as well. Granted, it doesn’t involve saving the world from evil entities that never sleep. But it does involve achieving objectives that, ultimately, culminate in a sale — whether that happens on the same visit (e-commerce) or weeks down the road (B2B). And that brings us to the customer journey map.
The customer journey map is a visual tool that enables businesses to identify where, when and how customers engage their brand — and make the trek from curious prospects to enthusiastic brand ambassadors. There are five phases on the journey:
In theory, the customer journey is straightforward. However, in practice — and just as Frodo & Co. discovered — the quest can have many twists and turns. No, there aren’t any orcs, hobgoblins or balrogs along the way, but there are some dangerous foes that include:
The bad news? Any one of these is enough to send customers heading straight for the exit, never to return. The good news? AlertBot’s leading solution continuously monitors for ALL of these from multiple locations around the world — and proactively notifies key individuals (e.g. CIOs, CTOs, SysAdmins, etc.) when a problem occurs.
You could say that AlertBot is leading-edge technology worthy of Gandalf, and yet so intuitive and easy-to-use that Pippin Took could manage everything (even after having a few pints at the Prancing Pony).
See for yourself by starting your free trial of AlertBot now.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>Debunking 3 Website Availability Monitoring Myths
by Louis Kingston
Some myths in life are harmless, or even helpful. For example, Santa Claus has come in very, very handy for parents who want to nudge their kids from the naughty list to the nice one. And let’s give a round of applause to the Tooth Fairy, whose promise of nominal financial compensation has turned the prospect of losing a tooth from a meltdown trigger into a motivational factor.
However, other myths are on the opposite end of the spectrum: they lead to stress and costs. The bad news is that there are some rather notorious website availability monitoring myths out there. But the good news is that debunking them is simple. Here we go:
Myth #1: Free website monitoring tools are just as good as paid versions.
The Truth: So-called free website monitoring tools are riddled with gaps and vulnerabilities — simply because they’re free, and the folks who make them aren’t trying to provide a public service or earn some good karma. They’re in business, and that means there’s always (always!) a hook. Here are some of the drawbacks: zero technical support, excessive false positives, reduced test frequencies, limited testing locations, and s-l-o-w product updates. For a deeper dive into these pitfalls, read our article here.
Myth #2: Buying website availability monitoring from your host is a smart idea.
The Truth: Your web host probably offers website availability monitoring, and keeps pestering you to buy it. What’s the harm? Well, here’s the harm: your web host is a web host. That’s their jam. They don’t specialize in website monitoring, which means that customers like you are going to pay for their lack of competence and capacity. And on top of this, your web host has an inherent conflict of interest when it comes to giving you the full picture — because your hosting agreement includes uptime standards. As such, they may be less inclined to be fully transparent if they fall below this standard. Or to put it bluntly: they might lie, and you’ll have a really hard (if not impossible) time trying to detect and prove it. For more insights on why it’s a bad idea to buy website monitoring from your host, read our article here.
Myth #3: Website availability monitoring is just about website availability monitoring.
The Truth: This last myth is especially tricky. Yes, website availability monitoring is about website availability monitoring. But that’s not where it ends. Comprehensive (i.e. the kind your business needs) website monitoring also analyzes key aspects such as website usability, speed and performance — because there are situations where a website can be available, but not accessible or optimized. To learn more about why comprehensive website availability is not just a technical necessity but also a customer experience requirement, read our article here.
The Bottom Line
Does your kid have a loose tooth, threatening to go to DEFCON 1? Do a myth tag team of the Tooth Fairy + Santa to avert a meltdown (and hey, you might even enjoy some extras out of the deal like getting them to clear the dishes after dinner or clean out the cat litter — kids are tough negotiators, but see what you can get).
But if you want to keep your business safe and strong, then steer clear of all myths, and equip yourself with the clarifying truths revealed above.
And speaking of clarifying truths: AlertBot TRULY offers world-class, surprisingly affordable and end-to-end comprehensive website availability monitoring — which is why it’s trusted by some of the world’s biggest companies. See for yourself by starting your free trial now.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>How to Solve 6 Common Browser Incompatibility Issues
by Louis Kingston
You have spent a small — or perhaps a large — fortune on your website, and now you’re ready to reap the rewards. You can picture it now: delighted visitors gushing about speed, performance, features, and functions.
Except…that’s not happening. Instead, visitors are running into browser compatibility issues — which means instead of moving forward on the buyer’s journey, they are heading straight to a competitor. That’s the bad news.
The good news is that you can (and frankly, you must) fix browser compatibility issues ASAP. Here are six of the most common problems, along with their associated solutions:
Problem: Various browsers render CSS styles differently.
Solution: Force all browsers to reset to the same basics by using CSS reset style sheets, such as Normalize.css (available on GitHub), HTML5Reset, or Eric Meyer’s CSS Reset.
Problem: Without a valid DOCTYPE, browsers fall back to “Quirks Mode,” which results in inconsistent tag handling and flawed rendering.
Solution: Add the magical declaration <!DOCTYPE html> as the very first line of every HTML document, which forces browsers to operate in Standards Mode vs. Quirks Mode.
Problem: Outdated JavaScript browser detection (i.e. user-agent sniffing) fails to correctly identify browsers, especially older ones.
Solution: Eliminate browser detection and replace it with feature detection via Modernizr, which rapidly runs a battery of tests to report which features the browser actually supports.
Problem: Unvalidated HTML/CSS leads to coding errors that some browsers do not auto-correct.
Solution: Use tools like W3C HTML validator and Jigsaw CSS validator to catch and fix errors, including the really tiny ones that can lead to major incompatibility headaches.
Problem: Certain functions designed to run on specific browsers are instead running on multiple browsers that cannot handle the request.
Solution: Add common vendor prefixes to the code, such as -webkit- (Chrome, Safari, newer versions of Opera, most iOS browsers, and any other WebKit-based browser), -moz- (Firefox), -o- (pre-WebKit versions of Opera), and -ms- (Internet Explorer and Microsoft Edge).
Problem: Third party libraries aren’t loading and working properly.
Solution: Use trusted frameworks and libraries that are cross-browser friendly, such as AngularJS and React (web application frameworks), Bootstrap and Animate.css (CSS libraries), and jQuery (scripting library).
How AlertBot Can Help
AlertBot monitors your website with real web browsers — not simulations! — to capture the most authentic end-user experience, and identify problems that others miss. Your development team can use this reliable information to solve problems, and ensure that all visitors enjoy a flawless experience.
Start your FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes. Click here.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>4 Essential Failure Analysis Reports for Monitoring Website Performance & Uptime
by Louis Kingston
In the movie Apollo 13, buzz-cut flight director Gene Kranz delivers a famous commandment: “Failure is not an option.” It would be nice if the same commandment held for websites. However, even an infinity of buzz cuts cannot change the fact that, alas, sometimes websites fail. And so, the question then becomes: how do you minimize the likelihood, duration and severity of website failure?
The answer probably isn’t enough to inspire a movie. But it’s more than enough to help businesses detect and remedy underlying problems with their website before they become full-blown catastrophes: use failure analysis reports.
There are four types of failure analysis reports that every business should be generating on a regular basis: Waterfall Reports, Web Page Failure Reports, Downtime Tracking, and Failure Events.
Waterfall Reports enable businesses to analyze the performance of every object that loads on their web pages (e.g. scripts, stylesheets, images, etc.), in order to identify common sources of bottlenecks, errors and failures. Waterfall Reports also display HTTP response headers, which help track down the source of slowdowns and breakdowns.
Many business websites have dozens of pages, and e-commerce websites can easily have more than 50, 100, or even 1,000. Manually hunting for problems can be tedious and futile. That’s where Web Page Failure Reports come to the rescue. They often contain a screenshot of what a page displayed during a failure event, along with the associated error log. This information can then be used to fix issues before they trigger visitor/customer rage.
No, Downtime Tracking isn’t the name of one of those bands that never smile when they sing. Rather, it’s a type of report that contains statistics on website and server downtime. Understanding the size, scope and source of downtime issues is critical to resolving them.
Knowing that a web page — or element(s) within a web page — is failing is important, but it’s not the full story. Failure Event Logs fill in the gaps by providing detailed information about what tests were performed, the geographical locations affected, and the errors identified.
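To see the kind of arithmetic a Downtime Tracking report boils down to, here is a back-of-the-envelope sketch in Python; the outage figures are invented for illustration:

```python
# Toy downtime-tracking arithmetic with made-up sample outages.
outages_minutes = [3, 12, 45, 7]    # hypothetical outages this month
minutes_in_month = 30 * 24 * 60     # 43,200 minutes in a 30-day month

total_down = sum(outages_minutes)   # 67 minutes
uptime_pct = 100 * (1 - total_down / minutes_in_month)

print(f"Downtime events: {len(outages_minutes)}")
print(f"Total downtime:  {total_down} minutes")
print(f"Uptime:          {uptime_pct:.3f}%")  # ~99.845%
```

Sixty-seven minutes sounds trivial until it gets multiplied by a per-minute downtime cost, which is exactly why these reports track the size, scope and source of downtime rather than just a feel-good uptime percentage.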
The Bottom Line
Are failure analysis reports as gripping and captivating as Apollo 13? No. Are they vital to website performance and business success? Yes. Because while website failure is unfortunately an occasional option, it absolutely cannot become a regular habit.
At AlertBot, we provide our customers with all of these failure analysis reports (and more) so they can get ahead of problems and avoid catastrophes. Start a free trial today.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>Why Your Website Host’s “100% Guaranteed Uptime” Promise is Bogus — and What to Do About It
by Louis Kingston
It’s been said that the devil is in the details. Well, along the same lines — and as we all know from miserable experience — when it comes to guarantees, the devil is in the small print. And there’s no better (or worse) example of this than with respect to the gleaming, confidence-inspiring claim by web hosts that they deliver 100% guaranteed uptime. Except, well, they don’t.
Here’s the thing: what you, everyone you know, and even random strangers in the street define as uptime — i.e. a website being online, operational and accessible — is not how web hosts define uptime. Confused? Of course you are. To make sense of this, you need to think like a web host.
Multiple Pieces of the Uptime Puzzle
There are multiple pieces of the uptime puzzle: the server on which your website lives, the data center that physically houses multiple servers, the ISP that connects the data center to the internet, and the carrier that links traffic between multiple ISPs. The uptime guarantee offered by web hosts begins and ends with the server and, if they own it, the data center. It does not include issues or problems with the ISP or carrier. As such, if there are points of failure in either of those components, then when your website does go down, your host will technically be meeting its promise. You’ve heard of a non-apology apology? Well, this is a non-guarantee guarantee — and it’s just as lousy.
Less than 100% Uptime = the Same Story
Now, you may have a website host that doesn’t sing from the 100% uptime/zero downtime songbook. It may, for example, promise 99.99% guaranteed uptime, or pledge some other Ivory soap-inspired technical cleanliness standard. Yet again, the same murky logic described above applies: as long as the host’s servers and (if owned) data center are humming along, then it’s an uptime guarantee party and everyone’s invited.
The Real Guarantee
At this point, you may be wondering — and not in a curious, childlike way, but in an agitated “what on earth is going on here!?” way — about what recourse you have available if and when your host does, indeed, bear responsibility for your website going down. That’s where the Service Level Agreement (SLA) kicks in.
Basically, in most cases, the SLA between you and your web host will entitle you to a prorated rebate based on downtime that meets two conditions: 1) the downtime is the responsibility or fault of the web host, and not the ISP, the carrier, the power company, hackers, natural disasters, wizard spells, alien invasion (or just alien visitation), or any other factor that is beyond its control; 2) the downtime can be proven.
So, for example, if your business pays $100/month for managed web hosting and your site goes down for half a day — and both of these conditions are met — then you’ll get around $1.67, most likely as a credit that will be applied to your next bill. Quite the luxurious guarantee, isn’t it?
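If you want to sanity-check that figure (or run it against your own hosting bill), the proration is one line of arithmetic; here it is as a tiny Python sketch with the same assumed numbers:

```python
# Prorated SLA credit for the example above (assumed figures).
monthly_fee = 100.00    # dollars per month for managed hosting
days_in_month = 30
downtime_days = 0.5     # half a day of qualifying, provable downtime

credit = monthly_fee * (downtime_days / days_in_month)
print(f"SLA credit: ${credit:.2f}")  # SLA credit: $1.67
```

Twelve hours of lost sales and frayed customer trust, compensated with pocket change: that is the real shape of the “guarantee.”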
What You Can Do About It
The bad news is that you can’t demand that your website host’s 100% uptime guarantee is, in fact, a 100% uptime guarantee as you, and pretty much everyone else, would define it. Unless the FCC and FTC decide that this is false advertising (and they haven’t done that… yet), the splashy promise will remain — and so will the legalese fine print.
But the good news is that you can equip yourself with a globally trusted advanced website monitoring solution like AlertBot, so that you instantly know exactly when your site goes down, why it went down and for how long. You can then use this data to pinpoint problems and fix issues immediately. AlertBot’s popular health map reports deliver crucial performance metrics straight to your inbox to help you stay on top of your sites. This data will also help you determine whether you should change hosts to one that is relatively better at keeping its promises.
Give AlertBot’s FREE trial a try today. There’s no billing information required, no installation, and you’ll be set up within minutes. Click here.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>7 Tips to Help Remote Workers Secure Their Home Wi-Fi
by Louis Kingston
Do you remember that old song called “Dem Bones” that goes: “ankle bone is connected to the shin bone, shin bone is connected to the knee bone, knee bone is connected to the thigh bone…” and so on? (It’s in your head now, isn’t it?)
Well, legend has it that hackers sing a similar song to their kids that goes: “remote worker’s wi-fi connected to the corporate network, corporate network connected to the privileged accounts, privileged accounts connected to the confidential data.”
True, it’s not as catchy, but hackers have never been about style points. They’ve been about doing what works over and over again until it stops working. And unfortunately, they’re having a ridiculously easy time these days hacking remote worker Wi-Fi setups, and establishing a foothold from which they launch into corporate networks — often with the goal of deploying malware to harvest confidential data (e.g. customer credit card numbers).
The solution to this problem? Ensure that remote workers fortify their home Wi-Fi setup, because it is definitely not in full security mode out-of-the-box. The problem with this solution? Maybe remote workers — especially non-technical types — don’t know what to do, and are afraid that if they tinker with their router, they won’t just be banished from the land of Zoom conferences and Slack chats with colleagues — they won’t even be able to surf bizarre Reddit subs at 3:00am or watch Minecraft videos on YouTube. What kind of existence is that?
Fortunately, going from Wi-Fi security zero to hero doesn’t require a PhD in Geekology. Here are seven things that remote workers can and should do right now (if they haven’t wisely done so already) to protect themselves and their organization:
The Bottom Line
Will implementing all seven of these recommendations make a home Wi-Fi network impenetrable? No. As long as there is going to be Wi-Fi, there is going to be risk. However, doing all of the above will certainly make it tougher for hackers, and like home burglars, most of them target low hanging fruit. If a Wi-Fi connection puts up a fight, they’ll usually just move on to the next victim until they find one who hasn’t followed the advice in this article.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>6 Tips to Prepare Your E-Commerce Site for the Biggest Holiday Traffic Surge Ever
by Louis Kingston
So it begins.
No, we are not talking about the school year, the football season, or a dizzying array of television shows about zombies, detectives, and of course: zombie detectives (seriously, it’s a thing).
Rather, we are talking about the beginning of what for most e-commerce businesses is the make-or-break race to the end of the year called “gift buying season.” Except this year, things are going to be different.
To understand why, let’s zoom in on what, for most e-commerce businesses, is the most critical period of the gift buying season — Cyber Week — which starts on Thanksgiving, and runs through to Cyber Monday. According to research by BigCommerce.com, during Cyber Week 2019 same-store sales across all verticals increased by a whopping 21% compared to Cyber Week 2018, and the average order value jumped by 10%.
And now, we come to 2020, a year in which billions of people are either obligated or advised to stay at home. These folks aren’t going to even consider hopping into their car to navigate the mall jungle. Instead, they’re going to pause Fortnite, minimize Reddit, crack their knuckles, replace the battery in their mouse, and BUY all kinds of stuff online: from toaster ovens to 60” 4K TVs to luxury sneakers to mounted singing bass fish (remember those?).
Simply put: 2020 is not just going to break e-commerce sales records, it is going to obliterate them. In fact, in terms of year-over-year surges in how many people buy online and how much they buy, there may never be another year quite like it.
For e-commerce businesses, this makes the 2020 gift buying season absolutely critical — which in turn means that crashed or slow websites are NOT OK. In fact, the mere idea of their possible existence is horrifying and just plain unacceptable, like a floating island of fire ants (which, unfortunately, is also a thing).
To prevent a catastrophe worse than anything the Griswold Family might experience, here are six essential things to do:
The Bottom Line
There aren’t many things that can be said with certainty about 2020. However, two things make the list: we will hear the phrase “new normal” at least a thousand more times before the year is up, and the gift buying season for e-commerce businesses is going to be colossal.
Whether that is colossal good (think Avengers) or colossal bad (think the Death Star) will largely be determined by the six essential factors described above. Which epic story do you want your e-commerce business to tell in the months ahead?
Try AlertBot today and see why it’s trusted and recommended by some of the world’s biggest enterprises. There’s no billing information to provide, nothing to download, and you’ll be completely set up in minutes — click here.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host
by Louis Kingston
For baseball pitchers, the two most glorious words in the English language are “perfect game.” For actors, it’s “Oscar win” (forget all that nonsense about how “it’s an honor just to be nominated.”). For school-aged kids, it’s “snow day.” And for businesses, of course, it’s “captive audience.”
Indeed, it doesn’t matter how compelling or clever a marketing and advertising campaign might be. If audiences don’t take notice and pay attention, it may as well not exist. And if you doubt this, think of the last time you sat through 20 minutes of movie trailers — not because you wanted to, but because there was nowhere else to go (at least, not without saying “excuse me…” 10 times as you painfully twisted and squirmed your way past annoyed fellow moviegoers).
Why does this matter? It’s because your web host is singing from the captive audience songbook when it repeatedly urges you to add site monitoring to your existing hosting package. At first glance, this may seem like a good idea. After all, you know that site monitoring is important. Why not just grab it from your web host, the same way you grab a side order of fries from a fast food restaurant? Well here’s why not:
Your web host doesn’t specialize in site monitoring, which means they aren’t using the latest technology or hiring the most qualified professionals. Just as you wouldn’t want your doctor to sell you a timeshare during an exam (“You know what might help that bronchitis? Two weeks a year in a sunny and warm Florida condo, as you can see from this lovely brochure”), you don’t want your site monitoring company to do anything but site monitoring. It’s not something anyone should be dabbling in.
When web hosts offer site monitoring, they typically focus on uptime. But site monitoring isn’t just about letting you know when your site goes dark. It’s also about making sure that your site is performing the way it’s supposed to — which means that all elements are functional (e.g. buttons, forms, multi-step processes, etc.), and all pages are loading rapidly. Without this critical information, you may believe that everything with your site is fine and all lights are green; that is, until you begin hearing from irate customers and start losing sales.
Last but not least, your site host is supposed to meet an uptime standard as part of their service commitment. But if that same host is also monitoring your site performance, they may be less inclined to be completely transparent if they fall below this standard. And if they did fudge some of the numbers, how would you even know? With this in mind, are we saying that all hosts that offer site monitoring are unethical? Absolutely not. Are we saying that there is an inherent conflict of interest that should be at least concerning and troubling? You bet.
The Simple, Smart Solution
The best (and really, the only) way to solve this problem is to avoid it completely — which means not buying site monitoring from your host, and instead getting it from a proven, reputable vendor that:
Ready to safeguard and strengthen your business with world-class, surprisingly affordable site monitoring? Then you’re ready for AlertBot! We check all of these boxes, and are trusted by some of the world’s biggest companies. Start your free trial now.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>The Basics of DNS Monitoring: What It Is, How It Works, and Why It’s Essential for Your Business
by Louis Kingston
On Star Trek, there’s an incredibly useful device called the universal translator. As you’d expect, it allows everyone to understand each other. For example, if Captain Jean-Luc Picard bumped into a race of aliens that bore a striking resemblance to Commander Riker’s beard, then they could set a date for some Earl Grey tea (hot) thanks to the universal translator. Without it, there might be grave misunderstandings and the firing of photon torpedoes.
DNS: The Next Generation
Well, the internet has its own kind of universal translator, which is somewhat less gloriously called the Domain Name System, or DNS for short. Essentially, DNS is a protocol that establishes the standards for how computers look each other up on the internet, as well as on private networks. Its purpose is to convert human-readable domain names into Internet Protocol (IP) addresses, so that computers can identify and communicate with each other. Without the universal language of DNS, surfing the web wouldn’t be surfing at all. It would be more like wading through quicksand, because we’d all have to keep track of hundreds, if not thousands, of IP addresses.
How DNS Works
Let’s say that you type “Google.com” into your web browser. Behind the scenes, your browser sends out a request to a recursive name server in order to get the IP address for Google.com (if the recursive name server doesn’t already have the answer cached, it works its way along the chain and queries the authoritative name servers for that domain, which hold the definitive records). Ultimately, provided that the website in question exists, the browser is provided with an IP address that tells it precisely where to go.
Now, does this mean that you could type in the IP address and cut out the middleman? Yes. For example, if you really wanted to, then you could type 172.217.10.14 — one of Google’s IP addresses — into your browser and head straight to Google.com without passing a DNS (or collecting $200). But why would you want to!? DNS allows you to remember simple names instead of strings of numbers like that.
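You can even watch the translation happen. Here is a two-line Python sketch (standard library only) that asks your system’s resolver to do exactly what a browser does before it can connect; the addresses printed will vary by location and time:

```python
import socket

# Ask the system's DNS resolver to translate a name into IP addresses,
# just as a browser does before it can open a connection.
addresses = {info[4][0] for info in socket.getaddrinfo("google.com", 443)}
print(addresses)  # whichever of Google's addresses your resolver returns
```

The set may contain several IPv4 and IPv6 addresses, which is itself a reminder of how much bookkeeping DNS quietly does on your behalf.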
Why DNS Monitoring is Essential: Part 1
The first reason why your business needs DNS monitoring should be self-evident: if for any reason your site name isn’t resolving, then visitors won’t be able to reach it. For all intents and purposes, it will be down. Constant and automated monitoring checks to see that everything is working and there is no need for anyone to scream “RED ALERT!”
Why DNS Monitoring is Essential: Part 2
DNS monitoring also checks to see that the name resolution process is swift vs slow. Why is this so important? Consider this:
Why DNS Monitoring is Essential: Part 3
Hackers frequently target DNS servers to redirect visitors to sites that deliver malware. Even scarier, hackers can obtain SSL encryption certificates that allow them to intercept and decrypt email and virtual private network (VPN) credentials.
The Bottom Line
DNS Monitoring lets you know three things that are more important than not plugging in a hair dryer when the U.S.S. Enterprise goes to warp speed: that your site is up, that your names are resolving quickly, and that your DNS server has not been hijacked by hackers. Without this information, the only way you will know that something is wrong is when angry customers or panicked colleagues start calling.
Boldly Go with AlertBot!
AlertBot automatically and continuously monitors your DNS servers (regardless of where they are located) to ensure that everything checks out, including A records (IPv4), AAAA records (IPv6), aliases (CNAME), SMTP mail server mappings (MX records), DNS zone delegates (NS records), SOA serial numbers, and more. And if an issue is suspected or detected, your team is immediately alerted so they can take action and solve the problem.
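To make those record types a little less abstract, here is a short sketch using the third-party dnspython library (pip install dnspython, version 2.x); the domain is a placeholder, and this illustrates per-record checks in general rather than AlertBot’s actual engine:

```python
# Query each DNS record type for a domain and print what comes back.
import dns.resolver

DOMAIN = "example.com"  # placeholder; substitute your own domain

for record_type in ("A", "AAAA", "CNAME", "MX", "NS", "SOA"):
    try:
        answers = dns.resolver.resolve(DOMAIN, record_type)
        for rdata in answers:
            print(f"{record_type:6} {rdata}")
    except dns.resolver.NoAnswer:
        print(f"{record_type:6} (no records of this type)")
```

A monitoring service runs checks like these continuously and compares the answers against what they should be, so that a hijacked record (say, an A record suddenly pointing at an unfamiliar address) raises a red alert instead of going unnoticed.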
Start a free trial now and boldly go with AlertBot!
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>10 Reasons for Site Crashes
by Louis Kingston
In the classic movie The Sound of Music, the whimsical governess Maria and the Von Trapp children sing about their favorite things — like raindrops and roses and whiskers on kittens. It’s joyful, it’s inspiring, and it’s in perfect harmony backed by a full orchestra. Isn’t Austria lovely?
Well, if Maria and co. were running a website (perhaps something to do with selling lederhosen or offering hiking tours in the hills), here are 10 things that absolutely wouldn’t be among their favorite things since they cause sites to crash:
First, the Bad News…
AlertBot’s acclaimed technology cannot prevent these dreadful things from crashing your site — although now that you know what you’re up against, you can be proactive. For example, you should test all plugins/extensions before adding them to your site; make sure that you have the right hosting package, and so on.
…now, the Good News!
AlertBot’s acclaimed technology CAN make sure that your team is immediately notified whenever your site crashes, so that you can take swift action and resolve the problem before your visitors get frustrated and head to the competition.
Try AlertBot free and discover why it will quickly become one of your business’s favorite things. Heck, you might even start singing about it in the halls.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>How To Keep Traffic Spikes from Crashing Your Website
by Louis Kingston
At first glance — and probably second and third as well — having too much traffic seems like a really nice problem to have; like when billionaires struggle to decide which yacht to buy (“I say Thurston, the one with the tennis courts is quite lovely, but the one with the outdoor cinema is so charming”).
However, too much traffic really is a problem, because it causes websites to either dramatically s-l-o-w down (which is terrible) or crash (which is worse than terrible). And right now, as hundreds of millions of people are advised or obliged to stay at home, there are a bunch of e-commerce businesses around the world that are experiencing this harsh, costly reality.
The good news is that your business can — and should — take proactive steps to keep traffic spikes from impaling your website, and causing revenue losses and reputation damage. Here is the to-do list:
The Bottom Line
More potential customers than ever before are using the web to find products and services — everything from digital gadgets to financial advisors to home repairs, and the list goes on. When the surge reaches your virtual address, you want to definitively know — and not just hope — that your website is ready, willing and able to handle the traffic.
Give AlertBot a try for FREE. There’s no billing information, no installation, and you’ll be set up within minutes. Click here
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>Beware These 5 Possible Dangers Lurking in Free Website Monitoring Tools
by Louis Kingston
We’ve been told by the poets that the best things in life are free: A sunrise in spring, the scent of a flower, the coo of a baby, having a buddy who can get his hands on football tickets. It’s all so beautiful and uplifting (especially the football tickets).
But at the same time, the economists remind us that there’s no such thing as a free lunch. And of course, we know from experience that this is often the case. How many times have we taken advantage of a so-called free offer, only to end up disappointed instead of delighted? A handful? Dozens? Hundreds? (And we haven’t even brought up that notorious gym membership yet…)
And that brings us to website monitoring. You know that this is important — or make that vital — to your business’s success. Indeed, going off-the-grid for even a minute can lead to lost sales and lasting reputation damage, and ongoing downtime issues can negatively impact search engine rankings. Hell hath no fury like Google and Bing scorned.
But what you may not know is that the throng of free site monitoring tools out there may be part of the problem — not the solution. Here are five potential dangers lurking in these tools:
Many free site monitoring tools offer no technical support to help you pinpoint issues and identify potential vulnerabilities and weaknesses. Instead, they provide you with a FAQ (or some other similar resource), and expect you to solve your own problems. You can’t even complain about this, because there’s nobody to complain to.
When is a downtime alert not a downtime alert? When it’s a false positive. These are truly (not falsely) frustrating and terrifying, and they’re a common problem among some free site monitoring tools.
In their marketing, all free site monitoring tools promise to “constantly scan your site.” That sounds comforting. But some of these tools define “constantly” differently than you would — and not in a good way. Several minutes can pass between tests, which means that if something goes wrong, you’ll be left in the dark for quite a while.
Many free site monitoring tools test from one or two locations (which is a worst practice) instead of from multiple locations around the world (which is a best practice).
Many free site monitoring tools don’t get the latest, greatest and safest product updates — because the companies that make them can’t afford to do so. After all, someone has to pay for that stuff.
Why Free in the First Place?
In light of the above, you may be asking a very sensible question: with so many fundamental drawbacks and limitations, why do some companies offer free site monitoring tools in the first place?
In two words: loss leader.
In more than two words: these companies use a free site monitoring tool to get customers onto their roster, after which the upsell parade starts — and it never, ever ends. Eventually, some of these customers end up buying a premium (license/subscription) site monitoring solution at a hefty price tag. The company does a happy dance, rings a bell, updates a giant telethon-like tote board, and smokes a bunch of cigars.
OK, they don’t do any of those things (at least, we hope they don’t), but the fact remains that the free site monitoring tool was never a legitimate, functional business-grade solution in the first place. Economists 1, poets 0.
And Then, There’s AlertBot!
AlertBot isn’t free, for the simple reason that we:
At the same time, AlertBot is refreshingly affordable and makes CEOs and CFOs as happy as it makes CTOs and CSOs. So yes, the best things in life are free. But second best is getting a GREAT deal on a solution that over-delivers. That’s AlertBot. Try it now and see for yourself.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
]]>Word (and Warning) to the Wise: Site Downtime isn’t Just a Technical Issue — it’s a Customer Experience Problem
by Louis Kingston
Businesses of all sizes — from small startups to large enterprises — are spending an enormous amount of money and time to deliver outstanding customer experience (CX). For example, they’re deploying contact centers, implementing customer-friendly return and warranty policies, training their workforce to be customer-centric, and the list goes on. And now, according to research by Walker Insights, CX is poised to overtake price and product as the most influential brand differentiator. To put this another way: customers are happily willing to pay a higher price, and for a more limited selection, if they’re getting the attention, performance, respect and results they expect — and frankly, demand.
The CX Gap that is Swallowing Customers
However, despite the fact that the CX party has been going on for a while and there’s no slowdown in sight, there’s a gap that many businesses are overlooking — one that is swallowing up their current and future customers, and transporting them directly to the competition: site downtime.
Here’s the thing: traditionally, site downtime has been primarily, if not exclusively, viewed through a technical lens, similar to a car breaking down or a roof springing a leak. And there is obviously truth in this perception. But it’s not the whole story, because customers out there on the virtual landscape equate site experience with customer experience. As such, when a site goes dark, they don’t think: “This customer-centric business has a technical problem with their website, and is surely going to fix it ASAP.” Instead, they think: “Wow, if this is what their website is like, then the rest of the business must be just as dysfunctional.”
Now, is this perception fair? Frankly, no. The vast majority of businesses — let’s say 99% of them — with site downtime truly care about delivering good (if not great) CX. These are the same businesses that, as noted above, are spending plenty of money and time on CX-related investments and training. They seriously and urgently want to get CX right.
But when their website breaks down or blows a virtual tire, this legitimate, longstanding investment and CX commitment is undermined — and customers react accordingly. Here are some of the grisly numbers:
The Bottom Line
The takeaway here isn’t that businesses need to care more about CX — because they know this already, and (hopefully) are acting on this understanding. Rather, it’s that businesses need to see the direct, immediate link between poor CX and site downtime. It’s not just a technical issue. For current and future customers, it’s the difference between whether they move forward on the buyer’s journey and become profitable brand ambassadors, or whether they head for the exit and never look back.
Protect Your Reputation + Impress Your Customers
AlertBot delivers world-class, surprisingly affordable monitoring that immediately notifies you when your site is not operational. You can then take rapid, focused action and solve the problem before your customers form the wrong impression — and never give you a second chance to make it right. Launch your free trial of AlertBot today.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
The (Not-So-Magnificent) 7 HTTP Errors that Infuriate Customers and Ruin Reputations
by Louis Kingston
In the classic flick The Magnificent Seven, a pack of essentially decent but “don’t you dare park your horse in my spot or else you’ll get your spurs blasted” gunslingers come together to rid a village of some nasty bandits. There’s action. There’s drama. There’s tragedy. There’s humor. There’s romance. There’s Steve freakin’ McQueen. What’s not to love?
Well, on the dusty and dangerous internet landscape, instead of a magnificent seven to save the day, there exist seven not-so-magnificent HTTP errors that are impossible to like, let alone love. Why? Because their purpose is to block visitors from reaching websites — which leads to lost customers and wrecked reputations.
Here’s a look at the reprehensible HTTP errors that have their picture on Most Wanted Lists in every post office from Tombstone to Dodge City (and, after the list, a quick way you could check for them yourself):
403 Forbidden: The 403 Forbidden error means that the server is absolutely refusing — no ifs, ands or buts — to grant permission to access a resource, despite the fact that a request is valid. Common causes include missing index files, and incorrect .htaccess configuration.
404 Not Found: The 404 Not Found error means that a web page or other resource can’t be found because it simply doesn’t exist. Common reasons for this include a broken link, a mistyped URL, or someone moving or deleting a page without updating the server (which happens a lot).
408 Request Time Out: The 408 Request Time Out error means that the server got tired of waiting — the client’s request took too long to arrive, so after a while, the server just throws in the towel. Often, this is because the connection is painfully slow or the server is overloaded.
410 Gone: Whereas (as noted above) a 404 error implies that there might be some hope — i.e. the target file might be somewhere, just not where it’s supposed to be — the 410 Gone error snuffs out any possible optimism. It’s totally, completely and permanently gone.
500 Internal Server Error: The 500 Internal Server Error means that the server cannot process a request for any number of reasons, such as missing packages, misconfiguration, and overload.
503 Service Unavailable: The 503 Service Unavailable error means that the server is either down because of maintenance, or because it’s overloaded. Either way, the server is conjuring up its inner Gandalf and screaming: “YOU SHALL NOT PASS!”
504 Gateway Time-Out: The 504 Gateway Time-Out error means that a higher-level upstream server isn’t working and playing well with a lower-level downstream server. After a while, the downstream server gets the message that it’s not wanted, and says “Oh yeah? Well, I don’t need you either!”
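If you’d like to spot-check a site for these desperados yourself, a few lines of Python will do the job. This is only a minimal sketch using the popular requests library, not AlertBot’s actual machinery, and the URLs are placeholders you’d swap for your own pages:

```python
import requests

# Status codes from the list above, with short labels for the alert message.
TROUBLE = {
    403: "Forbidden", 404: "Not Found", 408: "Request Timeout",
    410: "Gone", 500: "Internal Server Error",
    503: "Service Unavailable", 504: "Gateway Timeout",
}

def check(url: str, timeout: float = 10.0) -> None:
    """Fetch a URL and print an alert if it returns one of the errors above."""
    try:
        resp = requests.get(url, timeout=timeout)
    except requests.RequestException as exc:  # DNS failure, refused connection, etc.
        print(f"ALERT {url}: request failed ({exc})")
        return
    if resp.status_code in TROUBLE:
        print(f"ALERT {url}: {resp.status_code} {TROUBLE[resp.status_code]}")
    else:
        print(f"OK    {url}: {resp.status_code} in {resp.elapsed.total_seconds():.2f}s")

if __name__ == "__main__":
    for url in ["https://www.example.com/", "https://www.example.com/checkout"]:
        check(url)
```

Run something like this from a scheduler every few minutes and you have a (very) junior deputy; a real monitoring service adds multi-location checks, false-positive verification, and escalation on top.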
Calling in the Marshal
The bad news is that these reprehensible HTTP errors, if left unchecked, can cause a lot of damage. Indeed, few things irk and offend website visitors more than seeing an error code. But the good news is that you can call in the Marshal — a.k.a. AlertBot — to restore law and order.
AlertBot constantly scans your site’s pages to watch out for these and other HTTP errors. If and when they are detected, authorized employees (e.g. webmasters, sysadmins, etc.) are proactively notified so they can take swift action and fix the problem.
It’s lightning fast, always reliable, and as smooth as Steve McQueen. Dastardly, good-fer-nuthin’ HTTP errors don’t stand a chance!
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
Shopping for clothes in person is an entirely different experience than shopping online (where you can only guesstimate how a purchase may look or fit in real life), but we wanted to evaluate the online shopping reliability of two major clothing brands, GAP and Aeropostale, when it comes to the world wide web and their own individual website performance.
To test their web performance quality, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both GAP.com and Aeropostale.com from August 4th through August 18, 2019. (We originally planned to evaluate Abercrombie.com instead of Aero, but the site produced so many errors that we decided to choose a different company’s site to monitor.)
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both Aero’s and GAP’s sites achieved over 99% uptime. Neither saw significant downtime, which is expected, but each one experienced some sluggish speeds and even load time timeouts on a couple of occasions.
Aeropostale.com experienced 99.64% uptime, with over 20 errors recorded due to slow load times or brief periods of unresponsiveness. None of these events lasted longer than a couple minutes, however, and none of them amounted to any significant downtime. Because of this, we still consider their performance to be pretty good. (Aeropostale.com 8/10)
GAP.com experienced fewer issues, but struggled with some significant slowness on August 9th, resulting in 99.50% uptime. Otherwise, they would have had an overall stronger performance during this time period than Aero. (GAP.com 8/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring. We calculate the speed as an overall average across all locations during the time span selected for this Showdown.
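AlertBot’s TrueBrowser internals aren’t public, so for the curious, here’s a rough homebrew approximation of this kind of measurement: drive a real Firefox browser with Selenium and read the W3C Navigation Timing numbers the browser itself records. The URL is a placeholder.

```python
from selenium import webdriver

# A fresh Firefox profile approximates a first-time visitor with no prior cache.
driver = webdriver.Firefox()
try:
    driver.get("https://www.example.com/")  # placeholder; swap in the site you want to time
    # Navigation Timing milestones recorded by the browser, in ms since the epoch.
    timing = driver.execute_script("return window.performance.timing.toJSON();")
    load_s = (timing["loadEventEnd"] - timing["navigationStart"]) / 1000.0
    print(f"Full page load: {load_s:.2f} s")
finally:
    driver.quit()
```

One run from one machine is just an anecdote; averaging many runs from many locations is what turns a number like this into the figures below.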
When it comes to page load times, Aeropostale performed respectably, but at about twice the load time of GAP’s site. Their best day, on average, was Monday, August 5th with 6.1 seconds. Their worst day, on average, was Thursday, August 15th, with 6.8 seconds. The site’s overall average speed across the entire test period was 6.97 seconds, which isn’t terrible, but it also isn’t much to brag about. However, one thing certainly gleaned from these results is that Aero’s site is relatively consistent across the board when it comes to speed. (Aeropostale.com 7/10)
As teased above, GAP.com performed about twice as fast as Aeropostale.com did. Their best day, on average, was Sunday, August 4th with 2.4 seconds. That’s a pretty decent load time. GAP.com’s worst averaged day was Friday, August 9th, at 3.35 seconds, which is still almost half the time of Aero’s best day. The site’s overall average speed across the entire test period was 2.8 seconds, which is rather impressive. (GAP.com 8.5/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. For this portion of the test, we compare the overall average speeds of each individual location captured during the selected period of time for this Showdown.
When it comes to geographic performance, it seems safe to say that Aero’s site is all over the map. They performed best in North Carolina at an average of 2.6 seconds, with Nevada in second at 3 seconds and Oregon third at 3.1 seconds. Those times are not bad at all. However, their slowest time was a dismal 13.3 seconds (ouch!) in Missouri, followed by 13 seconds in California, and Washington DC in third place at 12.1 seconds. (Aeropostale.com 7/10)
GAP.com also saw some drastic differences on either side of the scale, but not nearly as substantial a difference as Aero’s. Their fastest average performance was seen in Nevada, at 1.7 seconds. Oregon came in second, also at 1.7 seconds, and Virginia was third at 1.8 seconds. Missouri was once again at the bottom of the proverbial bargain bin with 6.3 seconds, followed by Colorado at 5.21 seconds and Texas at 5.17 seconds. Still, GAP’s geographically slowest times look like Aero’s overall fastest times, which is rather disappointing for Aero. (GAP.com 8.5/10)
For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to see if we can find a nice sweater (since we’d love to cozy up in this fall weather) and add it to our cart.
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.GAP.com into our Chrome browser, it took 39.10 seconds and 8 clicks to get a sweater into the shopping cart and begin the checkout process. GAP had two pop-ups about coupons and joining their mailing list, and it took a few clicks to get around those. Then we navigated to the Men’s section, selected the first long sleeve crewneck we found and added it to the cart. (And hey, it’s 40% off, too. Woohoo!)
For www.aeropostale.com, it took 6 clicks and 35 seconds to browse their fall collection, snag a thermal hoodie tee, add it to the cart, and click checkout (and hey, the price was about half-off, too!).
Honestly, both sites are pretty nice, easy to use, and straightforward. The pop-ups on GAP.com were a bit annoying, especially with there being two of them, but it’s tough to gripe about getting offered coupons to save money when you’re shopping. Aero’s site felt just a smidge more inviting, like you’re browsing a tangible catalog, and it seemed to offer quite a few options up front.
All things considered, our Usability scores are:
(Aeropostale.com 9/10)
(GAP.com 9/10)
Both sites performed respectably, but when it comes to speed, one definitely outperformed the other—and the positive usability experience is just gravy. So, we’re pleased to announce this Showdown champion to be…
Winner: GAP.com
4 Common Causes of Cart Abandonment — and How to Solve Them
by Louis Kingston
It’s a sad story that has become so common that it just kind of blends into the background — like that awful elevator jazz that some coffee shops play (Thelonious Monk would NOT approve), or economy class in-flight meals (there’s less sodium on a salt lick, and you don’t get rammed in the ankle by a cabin trolley). Alas, we’re talking about the cart abandonment epidemic.
And epidemic is indeed the right word, because this problem is not local or limited. Forrester Research pegs the number of customers who bid adios to their cart at 87%, with 70% of them choosing to do so just before checkout. Overall, $18 billion worth of products each year are left to languish in digital trolleys.
Here are four common and costly cart-based reasons why customers flee the sales funnel, rather than triumphantly complete the buyer’s journey:
Customers don’t merely dislike unexpected costs like shipping, or nebulous “handling” fees (what, are people buying plutonium or something?). They absolutely hate them. There might even be a clinical psychological aversion to this called “unexpectedcostphobia.”
The solution: be transparent about all automatic or potential costs by advertising a clear and realistic estimate, providing a delivery calculator on the home page (not buried at the end of the checkout process), and if possible, offering free shipping for a minimum purchase.
A decade or two ago, customers didn’t mind creating an account to purchase something online, simply because they didn’t know there was any other way. It was part of the deal, like the turning of the earth or standing in line for longer than you should at the post office. It’s going to happen.
But now, customers have enjoyed a taste of the guest checkout experience — and many of them love it; especially if they’re suffering from security fatigue and wince at the idea of remembering more login credentials. Naturally, e-commerce sites that fail to cater to this preference set themselves up for plenty of cart abandonment.
The solution: if creating an account is mandatory, make the process as simple and fast as possible (and then make it even simpler and faster). In addition, give customers an incentive to create an account such as a discount offer, special gift, or anything else that has value and isn’t going to lead to a bankruptcy filing.
In 1970, The Beatles sang about the “Long and Winding Road” and scored yet another U.S. Billboard #1 hit. However, e-commerce sites that have a long and winding checkout process aren’t going to be certified platinum. They’re going to be certified terrified, because cart abandonment rates will be far higher than their competition.
The solution: ruthlessly streamline the checkout process to the bare minimum, and use as few fields as possible. Yes, getting as much glorious customer data as possible is important — but it’s not as important as getting customers on the roster in the first place.
Even entomologists don’t like website bugs and other completely preventable technical errors that make online shopping irritating instead of enjoyable. Even one of these bugs is enough to trigger cart (and brand) abandonment — let alone a bunch of them.
The solution: use a reputable third-party platform to constantly monitor all important web pages and multi-step processes — such as login, signup, checkout and so on — to proactively detect and destroy bugs, or anything else that makes customers miserable like slow page loading. Learn more about this here.
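That kind of multi-step monitoring is often called a synthetic transaction: a script that walks the same path a shopper would and raises an alarm at the first broken step. Here’s a bare-bones sketch of the idea in Python; the storefront URL, endpoints, and form fields are all hypothetical, and a real checkout flow would need its own pages and credentials.

```python
import requests

BASE = "https://shop.example.com"  # hypothetical storefront

def synthetic_checkout() -> None:
    """Walk a login -> add-to-cart -> checkout flow; raise on the first failure."""
    with requests.Session() as s:  # a Session carries cookies between steps, like a real shopper
        steps = [
            ("login",       lambda: s.post(f"{BASE}/login", data={"user": "probe", "pw": "secret"})),
            ("add to cart", lambda: s.post(f"{BASE}/cart", data={"sku": "TEST-123", "qty": 1})),
            ("checkout",    lambda: s.get(f"{BASE}/checkout")),
        ]
        for name, do in steps:
            resp = do()
            if resp.status_code != 200:
                raise RuntimeError(f"step '{name}' failed with HTTP {resp.status_code}")
            print(f"{name}: OK ({resp.elapsed.total_seconds():.2f}s)")

if __name__ == "__main__":
    synthetic_checkout()
```

Sites that render the cart with JavaScript would need a browser-driven version of the same walk, but the principle is identical: fail loudly at the exact step a customer would abandon.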
The Bottom Line
Completely eliminating cart abandonment isn’t possible, because there will always be customers who pause or stop the purchase process. But solving all of the problems described above significantly increases the chances that both carts and customers will get to the finish line, and be inspired to come back for more. And isn’t that the whole point?
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes
by Louis Kingston
In the 1989 flick Field of Dreams, Kevin Costner turns his Iowa cornfield into a baseball field because a voice tells him: if you build it, he will come. The “he” in question is his late father, and the movie has a magical, uplifting ending that makes us want to dream again (and possibly, play baseball or eat some corn).
Well, many folks who launch e-commerce sites also believe that: if I build it, they will come. This time, “they” means throngs of happy, profitable customers. Except…they don’t. And before long, the site is forced to scale down or shut down. Even writing to Kevin Costner doesn’t help — even if you promise to watch a double feature of The Postman and Waterworld (not recommended without a physician’s approval).
The bad news is that this kind of misery happens all the time. The good news — actually, make that the amazing, glorious, Field-of-Dreams-ending-like news — is that preventing this doom and gloom is largely a matter of avoiding these five big, scary and costly e-commerce site mistakes:
Tiny buttons that are impossible to click on a mobile device without a magnifying glass and hands the size of a Ken doll. Search functions that neither search nor function. Elusive top level categories. Gigantic banners that pop open and chase customers around from page to page, like a kind of online shopping Terminator (“I’ll be baaaaaack!”). These are just some of the many ways that lousy UX destroys e-commerce sites.
The remedy? Monitor all pages and multi-step processes (e.g. login areas, signups, checkout, etc.), to identify bottlenecks where customers routinely encounter errors or unresponsive behavior, and fix any gaps and leaks right away. Learn more about doing this here.
Just how vital is speed? Behold these grisly statistics:
The remedy? Be ruthless about making your e-commerce site as fast as possible (and then make it even faster). Here are the usual suspects: bloated HTML, ad network code, unoptimized images, and private data transmitted over public networks. There are other culprits, but look here first — you’ll be amazed at how much speed you unleash.
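If you suspect the usual suspects above, you can get a crude first read on page bloat with a short script that downloads the homepage, finds its images, and totals the bytes. The URL is a placeholder, and browser dev tools or a full audit tool will be far more thorough; this is just a quick sniff test.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class ImgCollector(HTMLParser):
    """Collect the src attribute of every <img> tag on the page."""
    def __init__(self):
        super().__init__()
        self.srcs = []
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

page_url = "https://www.example.com/"  # placeholder
html = requests.get(page_url, timeout=10).text
parser = ImgCollector()
parser.feed(html)

total = len(html.encode())  # start with the HTML itself
for src in parser.srcs:
    img = requests.get(urljoin(page_url, src), timeout=10)
    total += len(img.content)
    print(f"{len(img.content)/1024:8.1f} KB  {src}")
print(f"HTML + images: {total/1024:.1f} KB total")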
Let’s talk about health. Some people have poor health because they don’t exercise at all. Their daily calisthenic routine involves digging in the couch for the remote. And then on the other end of the spectrum, there are people who work out too much — we’re talking extremely unhealthy levels. You know the type.
The same phenomenon occurs in the e-commerce world when it comes to SEO. Some sites don’t focus on SEO, which means they aren’t going to get found by the 35% of customers who start their buyer’s journey from Google. And some focus so much on SEO that they neglect other channels and tactics — including good, old fashioned pure promotion.
The remedy? Definitely make SEO part of the visibility strategy. But don’t make it the be-all and end-all of online existence. It’s important, but it’s not everything.
Customer service is as important in the online world as the brick-and-mortar world, and in some cases it’s even more important, because exiting the buyer’s journey is so simple — as is writing a scathing zero-star review that would have made Roger Ebert wince. Unfortunately, many e-commerce sites treat customer service as an afterthought or a necessary evil, rather than an asset that should be leveraged to optimize customer experience and generate loyalty.
The remedy? Make customer service — characterized by the ease, speed, and quality of responsiveness and resolution — a big part of the plan. It’s not an expense, but an investment.
E-commerce sites aren’t vending machines, yet many of them seem to take their inspiration from these handy contraptions that dispense candy and soda in exchange for money and the push of a button (be careful you don’t press the wrong one — you might end up with that oatmeal cookie that has been there since 2007, and not the Snickers bar that you’re craving).
However, most customers — even those who are very focused on getting a specific item, like a pair of sneakers, a smartphone, or a hotel room — want and expect to access relevant information to help them make a safer, smarter purchase decision. This could be videos, infographics, social proof (e.g. testimonials, reviews, case studies, etc.), articles, blog posts, and downloadable assets like ebooks, checklists, and so on.
The remedy? Don’t skimp on creating original, compelling content. As a bonus, this will help with SEO and can connect you with profitable customers who are not in your primary target market.
The Bottom Line
Competition on the e-commerce landscape for the hearts, minds, and indeed, wallets of customers is ferocious. Avoiding these mistakes will go a long, long way to helping your e-commerce site survive and thrive.
You may even make enough profit to retire early, buy a cornfield in Iowa, and then turn it into a baseball field that inspires the feel-good movie of the year. Hey, it worked once before, right?
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
As technology continues to change with the times, the virtual reality experience keeps becoming more widespread and immersive. Two of the leading brands in the VR game are unmistakably VIVE (HTC) and Oculus. Both companies are leaders in the ever-expanding digital world of virtual reality, with both having released or having plans to release new headset models this summer.
While these brands may corner the market on connecting to the virtual realm, we wondered how they stack up when it comes to the world wide web and their own individual website performance.
To test their web performance quality, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both VIVE.com and Oculus.com from May 1st through May 22, 2019. Given the high regard in which these companies are held because of their products, we expected their web performance to be strong.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both VIVE’s and Oculus’s sites did perform quite well. Neither saw significant downtime, but each one experienced some sluggish speeds and even load time timeouts on a couple rare occasions.
VIVE.com experienced 99.91% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple minutes, and none of them amounted to any significant downtime. Because of this, we still consider their performance to be quite solid. (VIVE.com 8/10)
Oculus.com performed similarly with 99.98% uptime and similar slow page load errors that didn’t amount to significant downtime but at least put a minor hiccup in their performance. They experienced a quarter as many of these errors as VIVE, so they ended up coming out just a tiny bit on top. (Oculus.com 8.5/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring. We calculate the speed as an overall average across all locations during the time span selected for this Showdown.
The speeds for both websites were also relatively close to each other. VIVE.com’s best speed, on average, was seen on Monday, May 13 at 3.2 seconds, which isn’t bad. Their best time of day, however, was on Tuesday, May 21 at 5am with 1.6 seconds. That’s definitely better, although it’s doubtful they see a high volume of traffic at that hour. VIVE.com’s worst averaged day was Thursday, May 23rd at 5.1 seconds. However, their worst time was on Wednesday, May 22nd at 2pm with a much less admirable 8.8 seconds. The site’s overall average speed across the entire test period was 3.78 seconds. (VIVE.com 8/10)
Oculus.com performed very similarly. Their best day on average was Thursday, May 2nd with 3.7 seconds. Their best response time was at 9am on Wednesday, May 15 with 2.05 seconds. Oculus.com’s worst averaged day was also (like VIVE’s) Thursday, May 23rd at 4.37 seconds (although that’s slightly better than VIVE’s worst). However, their worst time of day was on Wednesday, May 1st at 6am with 7.49 seconds (making their slowest time more than a second faster than VIVE’s slowest). The site’s overall average speed across the entire test period was 3.96 seconds (just a smidge slower than VIVE’s). (Oculus.com 8/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. For this portion of the test, we compare the overall average speeds of each individual location captured during the selected period of time for this Showdown.
Previously, California had reigned supreme as the fastest state in the U.S. But lately, other states have been stepping up, dethroning The Golden State. This time, North Carolina wins (for both sites), with VIVE.com moving at a breezy 1.69 seconds in The Old North State. Oregon came in second at 1.8 seconds, with Arizona at 2 seconds. Comparatively, Washington state saw the slowest speed, coming in at a shameful 10.9 seconds, with Washington DC in second at 7.55 seconds and Texas in third at 7.43 seconds. (VIVE.com 8/10)
Oculus.com was also under two seconds with 1.9 seconds in North Carolina. Their second fastest was 2.2 seconds in Nevada and 2.3 seconds in Oregon. Overall, they were pretty close to VIVE. However, while Oculus saw a better overall “slowest” location, the second and third slowest were a little worse. Washington, DC came in at 8.66 seconds, then Washington state at 8.65 seconds, and Texas at 8.55 seconds. For the most part, though, the sites performed rather closely. (Oculus.com 8/10)
For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to see if we can order their latest VR headset.
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.VIVE.com into our Chrome browser, it took 1 minute and 36 seconds (and a wealth of clicks) to come to the conclusion that you cannot order anything from their website (at least not easily, even though there’s a shopping cart icon on their menu bar), and that viewing a map to “Try VIVE Today” tells us that we have to live in Livingston, UK if we want to visit a store.
For http://www.Oculus.com, it took 3 clicks and 16 seconds to add the Oculus Quest 64 GB headset to our cart and be ready to checkout.
For these tests, we go in without much prior knowledge of each site’s user-side functionality to keep the test unbiased, so we were pretty surprised at how drastically different the user experience was here. To give VIVE a fighting chance – even before trying Oculus’s site – we tried choosing a different headset in case the most recent one isn’t available yet, and it still didn’t help. Perhaps the problem is that we’re performing the test from the US while VIVE’s “Try VIVE Today” map points overseas. After further investigation, however, it appears that the only way to get to a purchasing option on VIVE’s site is to look at the “comparison” portion of the products page. Still, it seems odd that they wouldn’t make it easier and clearer to order their products. (Also, the webpage appears to end as you scroll through, but it merely swaps the panel you’re “stopped” on as you scroll down, then moves you down the page to the next panel before stopping you again. It’s a neat design, perhaps, but no doubt a little confusing at first.)
With that in mind, here are the Usability scores:
(VIVE.com 5.5/10)
(Oculus.com 9/10)
Both sites performed respectably, but when it comes to usability and speed, one unexpectedly outperformed the other—especially when it came to usability. So, we’re pleased to announce this Showdown champion to be…
Winner: Oculus.com
Choosing a Website Monitoring Firm? Ask These 5 Questions Before You Buy — not After
by Louis Kingston
Hey brother, can you spare $5 million?
That’s about what Amazon estimates it lost in sales back in 2013, when its website went down for around 40 minutes. For the math junkies out there, that’s $125,000 a minute, or $2,083.33 a second.
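If you want to run that same arithmetic against your own revenue numbers, it’s a one-liner; plug in your own figures in place of Amazon’s:

```python
def downtime_cost(revenue_lost: float, minutes_down: float) -> str:
    """Convert a downtime loss into per-minute and per-second figures."""
    per_minute = revenue_lost / minutes_down
    return f"${per_minute:,.2f}/minute, ${per_minute / 60:,.2f}/second"

# Amazon, 2013: $5 million over roughly 40 minutes
print(downtime_cost(5_000_000, 40))  # -> $125,000.00/minute, $2,083.33/second
```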
Granted, most businesses won’t suffer this kind of hefty financial setback if their website goes down. Sometimes, it pays not to be a unicorn. However, it’s enough to say that there will be a significant and wholly unwelcome cost — either due to lost sales (as in the case of Amazon), or lasting reputation damage. There can also be compliance issues that lead to fines and sanctions. Fortunately, that’s where website monitoring firms ride to the rescue and avert disaster, right? Well, yes and no.
Here is why: just like any other marketplace, there are good website monitoring firms out there, and there are bad website monitoring firms. Obviously, your mission is to make sure that you choose the former and avoid the latter. But how? All firms promise to offer “comprehensive and robust” web monitoring services. And based on this, you may believe that the only real difference between them is price — which is simply not the case. There are major categorical differences. And you do not want to discover after you sign (or affix your e-sig) on the dotted line that you’re on the wrong end of an over-promise and under-deliver arrangement.
To avoid that fate and help you filter website monitoring firms worth exploring from firms best avoided, here are five questions to ask before you buy — not after:
1. Is the monitoring platform fully integrated?
Ensure that you get a fully integrated monitoring platform that covers all of your digital properties — including your websites, mobile websites, web apps, and cloud services (SaaS) — so that you can access all of the real-time information you need in one place. Juggling multiple tools isn’t just tedious and complicated; it can lead to errors, oversights and disasters.
2. Do you monitor more than basic availability?
Don’t settle for just monitoring the basic availability of your URL. That’s like taking your car to the mechanic for a tune-up and calling everything perfect as long as it starts (and you still get a bill for $150). You want to dive deep and monitor full page functionality within real web browsers, verify all elements, scripts, and interactive features (like real clicks and keyboard interactions), and scan for errors to proactively detect problems. You also want the option to monitor any port on any server or device, and to track load times since, as we’ve written about, businesses with s-l-o-w websites are hanging out a virtual “Going Out of Business” sign.
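To make “monitor any port on any server” concrete, the core of such a check can be as simple as a TCP connect test. Here’s a minimal, standard-library-only sketch; the hosts and ports are placeholders for whatever your own stack actually exposes:

```python
import socket

# Placeholder (host, port) pairs: web server, mail server, database.
TARGETS = [("www.example.com", 443), ("mail.example.com", 25), ("db.example.com", 5432)]

for host, port in TARGETS:
    try:
        # Succeeds only if something is listening and reachable within the timeout.
        with socket.create_connection((host, port), timeout=5):
            print(f"UP    {host}:{port}")
    except OSError as exc:
        print(f"DOWN  {host}:{port} ({exc})")
```

A real service layers on protocol-aware checks, retries, and alerting, but this is the kernel of the idea.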
3. Is there anything to install?
Steer clear of (usually empty) promises that installation and setup are fast, easy, breezy, exciting, or any other adjective that you’d expect to hear in a shampoo commercial. You shouldn’t have to install anything whatsoever, and setup should take a matter of minutes — not hours or days.
4. Who handles maintenance?
That groan you hear is the echo of countless IT professionals who have valiantly fought — but lost — the battle to maintain website monitoring tools. End the suffering and be the hero that your IT team needs by choosing a firm that handles all maintenance, including ongoing updates and innovations.
5. Do you offer a real free trial?
There may be “no such thing as a free lunch,” but there is indeed such a thing as a free trial. The firms on your shortlist should offer you a full two-week trial vs. a few days, so that you can put everything to the test in your environment. After all, you wouldn’t buy a car without a test drive, right? Except in this case, there is no salesperson sitting beside you saying, “What’s it going to take to get you to drive home in this baby?”
The Bottom Line
Choosing the right website monitoring firm — and avoiding the wrong ones — is a critically important decision that, sooner or later, will impact your bottom line: for better or for worse. Asking prospective vendors all of the above questions is a smart and practical way to ensure that your selection is rewarding vs. regrettable.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
If there’s one snack shop you’re likely to find on any given street corner in your city, there’s a good chance it’s either a Dunkin Donuts or a Starbucks (and in some cases, they’re on either side of the street from each other). Both chains serve up steaming hot caffeinated goodness – at varying price points – as well as other sweet treats. And while different areas of the globe may have more common chains than these two, we East Coast natives have regular access to the fresh beans of these common coffee connoisseurs.
It’s no secret that those who rely on a warm, fresh cup of java to get their day started also know these bean beverages affect their daily performance. So we wanted to pose the question – what about the web performance of these respective coffee shops?
To test their website performance, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both DunkinDonuts.com and Starbucks.com from December 1st through Christmas Day, 2018. Given the popularity of both establishments, we expected their performance to be as strong as their brews, and we weren’t disappointed.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both Dunkin Donuts’ and Starbucks’ sites performed quite well. Neither saw significant downtime, but each one experienced some sluggish speeds and even load time timeouts on a couple rare occasions.
DunkinDonuts.com experienced 99.96% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple minutes, and none amounted to any significant downtime. Because of this, we still consider their performance to be quite solid. (DunkinDonuts.com 8.5/10)
Starbucks.com performed similarly with 99.87% uptime and similar slow page load errors that didn’t amount to significant downtime but at least put a wrinkle in their performance. They experienced four times as many of these errors as Dunkin, so we have to take that into consideration with our rating. (Starbucks.com 8/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
The speed for both sites were relatively close to each other. DunkinDonuts.com’s best speed, on average, was seen on Sunday, Dec. 2 at 4.8 seconds, which isn’t stellar by any means, but not the worst either. Their best time of day, however, was on Wednesday, Dec. 19th at 4am with 2.1 seconds. It’s considerably better, but 4am isn’t exactly prime web traffic time. Dunkin’s worst averaged day was Monday, Dec. 17th at 6.2 seconds. However, their worst time was on Saturday Dec. 22 at 9am with a crawling 10.5 seconds. The site’s overall average speed across the entire test period was 5.6 seconds. (DunkinDonuts.com 7.5/10)
Starbucks.com didn’t fare quite as well in comparison. Their best day on average was Saturday, Dec. 1st with 5.2 seconds. Their best response time was at 7am on Monday, Dec. 17 with 2 seconds. (It’s interesting that their best average time was on Dunkin’s worst averaged day.) Starbucks’ worst day on average was the previous day, Dec. 16, with 6.9 seconds, with their worst response time on average being at 9pm on Friday, Dec. 7th with a slightly-slower-than-Dunkin’s 10.7 seconds. But, as you can see, both sites performed pretty close to one another. Starbucks.com’s overall average speed during the entire test period was a tad slower, at 6.3 seconds. (Starbucks.com 7/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
If you’ve been following these competitions at all, you’ll know that no one beats California in website load time speed. However, lately, we’ve been seeing more variety when it comes to which state in the U.S. has the faster speeds. This time around, Nevada wins (for both sites), with DunkinDonuts.com moving at a swift 1.79 seconds in The Silver State. Oregon came in second at 1.8 seconds, with Ohio at 2 seconds. Comparatively, Washington state saw the slowest speed, coming in at 10.8 seconds, with Colorado in second at 9.2 seconds and Texas in third at 9.1 seconds. (DunkinDonuts.com 8/10)
Starbucks.com loaded at 1.4 seconds in Nevada, which was faster than Dunkin’s best time. Their second fastest was 1.5 seconds in Oregon and 1.7 seconds in Ohio – all better than Dunkin’s best (1.79 seconds). However, Starbucks saw significantly slower load times than Dunkin at the other end, with all of their slowest load times being worse than Dunkin’s slowest. Washington came in at 12.5 seconds, then Colorado at 11.6 seconds, and Texas at 11.4 seconds. While their fastest locations were a little faster than DunkinDonuts.com’s, their slowest were considerably slower, which is unfortunate. (Starbucks.com 7.5/10)
For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find their rewards program and get ready to sign up for it. (And we’re writing about it as we’re performing the test.)
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.DunkinDonuts.com into our Chrome browser, it took 15 seconds and 1 click to find the signup page for their rewards program. (OK, maybe this is too easy?)
For http://www.Starbucks.com, it took one click and 10 seconds to get to the rewards signup page.
For these tests, we attempt to go into them without much prior knowledge of the site’s user side functionality to give it an unbiased test, but this one probably calls for a retest with a different approach.
Let’s try navigating their respective menus and trying to find out about their coffee items.
With this in mind, from the point of typing in DunkinDonuts.com and navigating through their menu to their coffee options, it took 4 clicks and 23 seconds to get to the page with their regular drip coffee and its nutrition info. It’s a nice website and an enjoyable one to navigate.
With the same goal in mind, for Starbucks.com, it took 5 clicks and over 35 seconds to find the brewed coffee, but the confusing menu setup made it tough to find just plain, hot drip coffee. The Dunkin menu has images for all their options, but Starbucks drops most of the images once you get to the menu, so we ended up on the cold brew menu instead. (As it turns out, it was the fifth option, “Freshly Brewed Coffee” that we actually were looking for… you’d think it’d be one of the first options, though… right?)
Given that the first test was inconclusive, the second gave us a clear result (albeit an unexpected one). DunkinDonuts.com was quicker and easier to navigate, and much more user friendly.
With that in mind, here are the Usability scores:
(DunkinDonuts.com 9.5/10)
(Starbucks.com 8/10)
Both sites performed respectably, but when it comes to usability and speed, one unexpectedly outperformed the other—even if just by a little bit. So, we’re pleased to announce this Showdown champion to be…
Winner: DunkinDonuts.com
While we’re still recovering from full bellies and empty wallets from the Thanksgiving celebratory weekend, we pored over the performance results for each site and drilled in to see how they compared to last year’s event.
As usual, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor all three sites (Amazon.com, Walmart.com, and Target.com) from Thanksgiving Day through Black Friday and Cyber Monday, spanning November 22, 2018 to November 26, 2018. We expected strong, reliable performance again during the entire run, and we were not disappointed. The results were nothing short of impressive. In fact, we were pleased to see mostly improvement this year over last year.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Last year, in an unusual feat, each site experienced not a single error or failure event. The same mostly held true for 2018: both Walmart.com and Target.com struggled with a few slow file load times (which can cause a page to load slower), but it was never enough to cause any actual site downtime. With that in mind, we think it’s still fine to award 10s across the board.
(Amazon 10/10)
(Walmart 10/10)
(Target 10/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Last year was the first time we ran this event, so it was interesting to be able to compare last year’s results with this year’s. Ecommerce sites tend to have very graphics-heavy designs, and especially with sale events like these, the graphics are often big, frequently changing, and sometimes even animated or video-driven. (Amazon even had live video streaming at one point throughout the purchasing frenzy!)
With that said, over Amazon.com’s 5-day run, the fastest day on average was Sunday, November 25th with 4.2 seconds—almost exactly what last year was (their fastest was also a Sunday, at 4.3 seconds). Their slowest day, on average, was actually Black Friday itself at 4.5 seconds, which, admittedly, still isn’t too bad. When looking at specific times of day, the best hour was 7AM on Sunday with an impressive 2.6 seconds (an improvement over last year by almost a full second), while the day before saw the slowest hour at noon with a dismal 9.3 seconds (significantly worse than last year).
(Amazon 9/10)
Walmart.com was the fastest last year and proved not only to hold that title again this year but to show improvement as well! Their best average day was Cyber Monday, November 26th at 3.8 seconds. Their worst day on average was Sunday, November 25th, at 4.1 seconds (coincidentally, it was also Nov. 25th last year, but this year it was almost a full second faster). Finally, their best hour on average was on Cyber Monday at an impressive 1.8 seconds at 6PM. Their worst time on average was 6.9 seconds at 5PM on Black Friday, which is not when you want to be experiencing your slowest web speed.
(Walmart 9.5/10)
Last, but certainly not least, Target.com performed respectably, but once again underperformed in comparison to the other two. Their best day for speed, on average, was Black Friday at 5.4 seconds, which is not only worse than both Amazon’s and Walmart’s worst days, but 0.2 seconds slower than their own performance last year. Target’s slowest day on average was Cyber Monday, November 26, at 6.3 seconds, almost a full second slower than last year. Their fastest hour turned out to be on Black Friday at 5AM with 3.1 seconds, a slight improvement, while their slowest was Monday at 3PM with 8.9 seconds, over a second slower than last year, and sadly right in the heart of Cyber Monday.
(Target 8.5/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
California has almost always come out on top as the fastest state, but this year it was consistently dethroned by none other than Oregon! For Amazon.com, the ecommerce mega-site saw average load times of 1.4 seconds in The Beaver State, with their next-fastest locations being Ohio at 1.6 seconds and Nevada at 1.8 seconds. When it came to their slowest locations, Washington, D.C. took the prize at a sluggish 7.5 seconds, with Washington state clocking in at 7.3 seconds.
(Amazon 9/10)
Just like in 2017, Walmart.com was faster, but by a mere tenth of a second, seeing an average load time of 1.3 seconds in Oregon. Nevada and Ohio followed at Amazon’s fastest time, 1.4 seconds. Washington state saw the site’s slowest load time at 6.8 seconds, with Colorado coming in at 6.5 seconds and Texas at 6.3 seconds – all of them faster than Amazon’s worst locations.
(Walmart 9.5/10)
Target actually saw some improvement this year, with its average load time being fastest in Nevada at 2.3 seconds (last year’s best was 2.7 in California), while Oregon came in second at 2.5 seconds and Ohio third at 2.7 seconds. And like last year, Target’s fastest speeds proved to be slower than their competitors’. The slowest average speed that Target saw in the U.S. was sadly worse than last year’s. Washington state clocked in at a truly dismal 10.7-second average load time, with Colorado a second behind at 9.6 seconds, and Texas at 9.3 seconds. It’s unfortunate that Target continues to miss the mark for website speed.
(Target 8.5/10)
For usability, we always select a common task a user might typically try to accomplish when visiting the sites we’re testing and replicate it. For last year’s Showdown, we decided to see what the experience would be like to use these three different websites to add a common product to the shopping cart. To do this, we selected one item to search for and add to our cart, and this year we decided to do the same again.
For each of these processes, we picked an easy item to search for, and sought to add a Blu-Ray copy of Disney and Pixar’s Incredibles 2 to our shopping cart. To begin each process, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.amazon.com into our Chrome browser, typing “Incredibles 2 blu-ray” into the store’s search box, and adding it to the cart, it took 34 seconds. From the front page, it took about 5 clicks (including having to log in to get to the final checkout) to get to the “Place your order” window.
From the point of typing www.walmart.com into Chrome and going through the same process, it took about 6 clicks and 32 seconds to log in and get to the final cart checkout page.
And from the point of typing www.target.com into our Chrome browser, it also took about 6 clicks and 32 seconds to log in and get to the checkout window.
Each site was a good experience to use, although each one has a different feel and approach. It’s a tough call to say which user experience we found to be better, but each one was straightforward and easy to use. If we judge the sites based on search results, Amazon suggested a few things unrelated to the specific search for the “blu-ray” disc first (like a Jurassic Park daily deal and a preorder for Venom), while both Target and Walmart had more direct and accurate results (even though Walmart suggested the DVD and 4K before the actual blu-ray). In that case, we’d have to give Walmart and Target a little more props for accuracy in their product search.
(Amazon 9.5/10)
(Walmart 9.5/10)
(Target 10/10)
With stakes this high once again, you would only expect the best from the leaders in ecommerce, so it comes as no surprise that the results were so good and so close.
With all things accounted for – reliability, speed, geographical performance, and the site’s usability – we’ve reached our verdict, and it surprises even us for a second year in a row:
Winner: Walmart.com
Even though our world continues to creep ever closer to being paper-free — trading our paper tablets for iPads — office supply stores have had to reinvent the way they do business and what their focus is. Staples and Office Depot are two mega-chain retailers who’ve long been in the fight, regularly providing printing services, as well as day-to-day necessities for the workplace, like pens, calendars, computer accessories, and so much more. And with the all-in-one ecommerce solutions monopolizing the public’s needs (we’re looking at you, Amazon), the desire to shop at these niche market leaders — who typically charge more for the same products — is becoming less and less common.
So, for our latest Showdown, we looked at these two office supply bigwigs and used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from August 26 to September 16, 2018. After engaging in this different kind of “Office Olympics,” we were expecting the usual quiet response from two reliable websites (i.e. good performance), but instead found what was equivalent to, well, a fun office chair race gone horribly wrong.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both Staples’ and OfficeDepot’s sites seemed to perform satisfactorily, with neither site ever seeing significant downtime, but one of them really seemed to struggle with its load time.
AlertBot ended up returning over 800 alerts from Staples.com in the evaluated time span, with half of them being slow files bogging down the page, and the other half being page load timeouts. This doesn’t necessarily mean the site crashed, just that it was taking unusually long to load. Their site regularly had a pop-up window during this time period promoting signing up for their email list, which seemed to play a part in disrupting the site’s load time and process. (Staples.com 5/10)
On the flip side, OfficeDepot.com performed much better (despite also having a pop-up on its page), but while it saw problems less often, it did log two failure events, finishing at 98% uptime (compared to Staples’ 100%). The majority of the errors OfficeDepot experienced were slow files or longer load times. Despite this, however, its worst times were in the middle of the night (a frequent site maintenance window), which is common for most sites. (OfficeDepot.com 7/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Staples.com’s speed tests proved that load times were a regular issue. Its best day, on average, was Monday, September 17th with 7.9 seconds. It’s not the worst load time, but given that most sites are expected to load in 2 to 3 seconds these days, it’s almost three times that. Their best time of day was on Thursday, September 6 at 10am with 3.3 seconds. The worst day, on average, was Friday, September 7th with 10.3 seconds, while the worst time of day was at 1am on Sunday, September 9th with a sluggish 13.8 seconds. (Staples.com 7/10)
OfficeDepot.com actually fared worse, comparatively. Their best day proved to be Thursday, September 6 with 9.9 seconds for the page to load. Their best time of day was at 6pm on Wednesday, September 5th at 6.4 seconds. Their worst was significantly worse, with Monday, August 27th seeing an average of 12.5 seconds, and the worst time of day being on the same day at 3am with 16.8 seconds! (OfficeDepot.com 6/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
Typically, for the geographic tests, California is king, always turning in the fastest response time. For Staples, it was actually North Carolina, which saw an average of 3.7 seconds of page load time. Washington, DC was second at 4.7 seconds, and New York was third at 5.2 seconds. The state with the slowest results was Missouri at 15.1 seconds, with New Jersey right behind at 15 seconds. Oddly enough, California, Florida, Colorado and Virginia all averaged 15 seconds as well — which is unusual. (Staples.com 6.5/10)
Things were the norm for OfficeDepot, however. They saw their fastest speeds in California, at 7.5 seconds, with Virginia being second fastest at 7.7 seconds. Their slowest performance was Missouri with a crawl of 19.9 seconds, and Utah followed it up at 15.6 seconds. (OfficeDepot.com 6/10)
These aren’t the worst website load times we’ve seen, but they also weren’t anything to brag about either.
For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find an office executive chair and add it to our shopping cart.
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.staples.com into our Chrome browser, it took 30 seconds and 5 clicks to search for “office executive chair,” click on one to view its product page, add it to the cart, and click “checkout.” (It had us thinking “That was easy!”)
For OfficeDepot.com, it took about 40 seconds and 6 clicks to get to the checkout process. OfficeDepot had a pop-up as soon as we got to the site which added one click, and then clicking on the cart and going to the checkout seemed to be a clunkier experience.
It’s a tough call for usability, but we did find the Staples checkout process to be a tad smoother.
All things considered, here are the Usability scores:
(Staples.com 9/10)
(OfficeDepot.com 8/10)
It’s surprising how closely these two office supply giants performed – and how disappointingly each did as well. Still, neither was so bad that it experienced many full-on website failures, but both could benefit from some serious attention paid to increasing their website speed. Neither site really stands out above the other with its performance, because the good and the bad often balanced each other out, but when it comes down to considering sheer usability as a tie breaker, we feel the verdict is…
Winner: Staples.com
3 Reasons Why Website Speed is More Important than Ever
by Louis Kingston
Today’s business environment is relentlessly fast-paced. Today’s startups blast into tomorrow’s enterprises. And just as rapidly, today’s unicorns take a one-way journey into “hey, whatever happened to…” country. However, there’s another critical piece of the velocity puzzle that many businesses are missing, and it’s costing them customers and profits: the speed of their website.
Speed Kills Lives
Nearly 50 years ago, the government introduced the phrase “speed kills” to warn drivers that going too fast from point A to point B could result in a detour to point C (the police station), point D (the hospital) or point E (the morgue). It was good advice then, and it’s still good advice now.
But when the scene shifts from the asphalt freeway to the information superhighway, speed doesn’t kill anything. On the contrary, it keeps websites alive as far as visitors are concerned. Here are the 3 reasons why:
The word “bouncy” has a happy and positive feel to it, while the word “sticky”…well, it doesn’t. Nobody shows up to a birthday party excited to jump around in the sticky castle, and swimming pool diving boards wouldn’t be doing their job if people stuck to them (although it would be kind of hilarious).
But when it comes to websites, sticky is glorious and bouncy is dreadful — and that’s where speed makes a massive difference. A study by Kissmetrics found that a one second delay in load time can send conversion rates plunging by seven percent! Think about that. Actually, don’t think about that. Just read this sentence. That took a whopping two (!) seconds.
An old joke in the SEO world goes like this: “Where’s the best place to hide a dead body? Page two of Google.” (And in related news, an old conversation among psychologists is: “Why do SEO people make jokes about hiding dead bodies?”)
Macabre humor aside, the point is simple to understand: for most (if not all) of their keywords, businesses either need to be on page one of Google — and preferably in the top three positions — or they might as well be advertising in the Yellow Pages (ask your grandparents).
Once again, speed is a big part of the SEO story. Google — which is obsessively secretive about how its algorithm works (the first rule of Google Search Club is that you don’t talk about Google Search Club) — has actually gone ahead and formally confirmed that page speed is a significant SEO ranking factor for mobile and desktop searches.
The moral of this story? All else being equal, a website that loads faster will rank higher than one that loads slower. And in the long run, that could mean the difference between surviving and shutting down.
Einstein revealed that time, quite literally, is relative. But you don’t have to become a physicist or get yourself on a million memes to experience the deep truth of this in your bones. Here’s a fun little experiment:
Imagine that your favorite football team is losing a very important game. It’s late in the fourth quarter, and your beloved team is behind by six points. Although the clock is ticking down one second at a time, in your view the time is racing by. Surely, the clock must be rigged!
Now, imagine that your team is ahead instead of behind. The clock is still ticking down one second at a time, but to you it’s not racing — it’s grinding slowly and painfully. Yet again: the clock must be rigged!
What this simple example demonstrates is what psychologists dub the perception of speed. Essentially, our emotions influence how we perceive the passage of time. Just a few seconds can seem like the blink of an eye or a tedious wait — as we all know from waiting in the (not-so) express line at the grocery store.
The direct link to website speed here is unmistakable: visitors dislike waiting for websites to load. Actually, they hate it. Each extra second compounds their unhappiness and makes it more likely that they’ll exact revenge by smacking the back button on their browser, never to return.
No, this doesn’t mean that websites must load instantaneously, like flipping channels on a TV. Technology isn’t there yet, and visitors aren’t unreasonable or unrealistic. But yes, it does mean that speed is connected to UX and, ultimately, to brand: fast loading times create positive experiences and emotions that become associated with the brand, while slow loading times do the opposite.
The Bottom Line
Website speed has always been important. But these days, it’s crucial — and in many cases, it’s THE MOST IMPORTANT factor. After all, it really doesn’t matter how amazing a website is and what it offers, if visitors never get there in the first place.
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
While on the Run Wild 5K/10K trail, we ran through the Trexler Game Preserve, an 1,100-acre animal sanctuary. The race finished inside the Lehigh Valley Zoo, which led everyone past its animal exhibits, including camels, zebras, and kangaroos. A few of us even stuck around after the race to mingle a bit with the zoo’s various furry residents.
The AlertBot team is excited to be able to contribute to such a noble cause as animal conservation, especially with thousands of species remaining endangered today. All proceeds from the race went to benefit the Lehigh Valley Zoo’s animal conservation efforts, which raised over $25,000 last year and nearly doubled that this year at $40,000.
Run Wild was a success, and we can’t wait for the next opportunity to strap on our sneakers and join in the efforts to make a difference in our community!
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from July 1 to July 22, 2018. As both sites and services are continuing to grow and change (Heaven knows MoviePass will probably change their rules and operations again before you finish reading this sentence), we weren’t surprised to see how similar the sites for each service performed.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both MoviePass and Sinemia performed well here, but one did seem to struggle a little more than the other.
MoviePass.com experienced a 98.2% average uptime due to several days where the site performed slower than usual, causing pages not to load fully – even triggering a strange account lookup error on the front page for several hours on July 14th. This resulted in 18 failure events cataloged by AlertBot, with an average failure time of 32 minutes. This doesn’t mean downtime, per se, but the details did show that the site was struggling with its speed and load times. (MoviePass.com 7/10)
Comparatively, Sinemia.com saw 99.98% uptime with 1 failure event, although it wasn’t anything that spelled major downtime. At worst, it appeared to be a slow page / busy error that didn’t last long enough to qualify as site downtime. Overall, Sinemia proved to be pretty reliable. (Sinemia.com 9/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
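For readers curious what a test like that looks like mechanically, here is a minimal sketch of the same idea (not AlertBot’s TrueBrowser internals; it assumes Python with Selenium and geckodriver installed): drive a real Firefox with a cold cache and time the page load.

```python
from selenium import webdriver

options = webdriver.FirefoxOptions()
options.add_argument("--headless")
# Each run starts a fresh Firefox profile, so there is no prior cache --
# the same "first-time visitor" perspective described above.
driver = webdriver.Firefox(options=options)
try:
    driver.get("https://www.example.com")  # blocks until the page's load event
    load_ms = driver.execute_script(
        "const t = performance.timing;"
        "return t.loadEventEnd - t.navigationStart;"
    )
    print(f"Fully loaded in {load_ms / 1000:.2f}s")
finally:
    driver.quit()
```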
MoviePass.com saw acceptable page load speeds overall, with their best average day being Wednesday, July 4th with 3.9 seconds. The best time of day was 1am on Friday, July 20th (which isn’t a popular time to even be using a site like theirs) at an average of just 1.6 seconds. On the other side of the proverbial coin, the slowest day was Saturday, July 14 with an average time of 8.9 seconds, and the worst time of day was also on the same day at noon (yikes!) with an embarrassing 14.1 seconds. (MoviePass.com 7.5/10)
Sinemia actually didn’t perform too much better, with their best average speed for a single day being Saturday, July 21 with 5.4 seconds and their best time of day being Wednesday, July 4th at 5pm with 2.7 seconds. Their slowest day was Monday, July 23rd with 7.3 seconds, with the slowest time being on July 2nd at 10pm with 10.2 seconds. (Sinemia.com 8/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
MoviePass.com performed the fastest in California with 1.8 seconds, with Florida coming in second at 2.4 seconds. The site performed slowest in Missouri with a sluggish 10.2 seconds, with Utah coming in second at 8.5 seconds. (MoviePass.com 8/10)
For Sinemia.com, California was also the fastest at 2.9 seconds, and Virginia was second fastest at 3.5 seconds. Missouri was also the slowest, at 11.3 seconds, with Utah being second slowest at 9.1 seconds. (Sinemia.com 7.5/10)
Neither site was all that impressive in terms of speed – which is interesting considering there isn’t a whole lot of content on their websites to slow them down.
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to start the service signup process (but not complete any forms).
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.moviepass.com into our Chrome browser, it took a mere 18 seconds and 2 clicks to see their plans and get to the signup form. It was a piece of cake.
For Sinemia.com, it was actually just as smooth. In 17 seconds and 2 clicks, we were able to select a plan and get to the signup page.
It’s a tough call for usability. They’re simple processes, but they get the job done and we have no complaints.
All things considered, here are the Usability scores:
(MoviePass.com 10/10)
(Sinemia.com 10/10)
Usability usually isn’t this straightforward and evenly matched between both sites, so we’re left to look almost exclusively at the other categories to draw a conclusion.
Even granting that MoviePass may see more speed hiccups due to heavier traffic, Sinemia.com seems the clearer choice for reliability as a whole, though the sites are quite close. That bad day on July 14 really hurt MoviePass’s performance during this evaluation period, and it can’t be ignored. So, with that said, we believe the verdict is…
Winner:
How To Reduce HTTP Requests To Speed Up Your Site
by Louis Kingston
Most of us are blissfully unaware of the technical feats happening in the background when we browse to a webpage. We typically only notice that there is some technical failure when the site we’re visiting takes such a long time to load that we get impatient and click refresh or the site outright displays an error message. In actuality, there’s a lot that goes on between your web browser and the web page you’re visiting.
For digital marketers, it might seem that their expertise only needs to cover search engine optimization, content marketing, and pay-per-click. But their hard work will never see the light of day on page one of the search engine results if the page’s load speed is extremely slow. Visitors will just click away to a site that loads quickly and without any errors.
In Pursuit of a Better User Experience
One of the main reasons for a slow site speed is a high number of HTTP (Hypertext Transfer Protocol) requests.
In a nutshell, here is what happens when you decide to visit a website: your browser looks up the server’s address, opens a connection to it, requests the page’s HTML, and then issues a separate request for every additional file (images, scripts, style sheets) that the page references.
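To make that concrete, here is what a single request/response cycle looks like from code (a rough sketch assuming Python’s third-party requests package; the URL is a placeholder):

```python
import requests

# One HTTP request/response cycle: ask the server for the page's HTML.
resp = requests.get("https://www.example.com/")
print(resp.status_code)                  # e.g. 200 (OK)
print(resp.headers.get("Content-Type"))  # e.g. text/html; charset=UTF-8
print(len(resp.content), "bytes of HTML")
# Every image, script and style sheet referenced in that HTML will
# trigger its own, separate HTTP request before the page is complete.
```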
Are You Taking Too Long to Respond?
Imagine it is your website that is being visited. Before the files can display in the visitor’s browser, a separate HTTP request is made for every file that makes up the page (images, JavaScript files, style sheets, etc). Many of these files are large and can take a long time to download; most polished sites these days carry high-definition images, which can mean slow load times. That makes the Google algorithm unhappy, and you get penalized, which can cost you your top spot on the search engine results page. On top of losing position in search results, studies show that potential visitors will not stick around waiting for a slow page to load, so you can say goodbye to conversions. In fact, 47% of visitors to a site want to see a load speed of under 2 seconds (per a KISSmetrics report), and after three seconds, you can expect 40% of people to hightail it out of there to find a faster alternative.
The ideal number of files that make up a single web page is 10-30 files, but these days we see the number of HTTP requests balloon to over 100 per page on some sites!
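If you’re wondering where your own pages fall in that range, here is a minimal, standard-library-only sketch that tallies the extra requests a page’s HTML will trigger (it misses CSS imports and dynamically loaded assets, so treat the number as a floor, not a census):

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class ResourceCounter(HTMLParser):
    """Tallies tags that typically trigger additional HTTP requests."""
    def __init__(self):
        super().__init__()
        self.counts = {"img": 0, "script": 0, "stylesheet": 0}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.counts["img"] += 1
        elif tag == "script" and "src" in attrs:
            self.counts["script"] += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.counts["stylesheet"] += 1

html = urlopen("https://www.example.com/").read().decode("utf-8", "replace")
counter = ResourceCounter()
counter.feed(html)
print(counter.counts, "->", sum(counter.counts.values()), "extra requests, at minimum")
```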
How Can You Lower Your HTTP Requests?
Common tactics include combining CSS and JavaScript files, merging small icons into CSS sprites, trimming unneeded images and third-party scripts, and letting browser caching do more of the work. Sounds technical, right?
AlertBot can provide you with the tools you need to pinpoint performance issues and help set you on the right path to better website performance. AlertBot offers a Free 14-day trial (without collecting any billing info). Give us a try!
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
Whether it’s designing a centerpiece for home or an event, perusing the aisles for tools for a school project, or locating a frame for that beloved photograph, it’s likely you’ve found yourself inside an arts and crafts store at some point. From cloth patterns to drawing pencils to blank canvases and custom framing, these craft supply stores are just what creative people look for in a retailer.
With the rise of ecommerce, arts and crafts stores are just as accessible from the comfort of your computer or mobile device. For artists and crafters, something is undoubtedly lost when shopping online for these kinds of supplies, but the ease of online shopping is undeniable. Two of the biggest players in the market are Michaels and A.C. Moore, so for this, our ninth Showdown, we’ve pitted the web performance of these two leading crafty retailers against each other.
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from March 25, 2018 to April 8, 2018. As expected, both sites performed quite well, but as in most cases like this, one site saw better results than the other.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both websites did really well here, with neither site seeing any significant, true downtime.
Michaels.com experienced 99.9% average uptime due to 2 page load timeout failure events (where something on the page takes a bit longer to load, slowing the page’s overall performance down). Drilling down into the errors Michaels.com returned, we found 17 instances where the page took longer to load than expected and 15 where a slow file on the page dragged things down. Still, despite the 2 timeouts, Michaels did well overall. (Michaels.com 8.5/10)
Comparatively, ACMoore.com saw 100% uptime with no significant failure events. There were still 4 recorded slow files and 4 occasions when the page itself took longer to load than expected, but ACMoore.com never actually went down, so we have to give them high marks for that.
(ACMoore.com 9.5/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Michaels.com saw pretty decent page load speeds overall, with their best average day being Wednesday, April 4th with 3.5 seconds. The best time of day was 6pm on Friday, April 6th at an average of just 2.1 seconds. On the flip side, the slowest day was Sunday, March 25 with an average time of 6.8 seconds, and the worst time of day was Sunday, April 8 at 8pm with 6.7 seconds. (Michaels.com 8.5/10)
ACMoore.com was truly impressive with their load time. Their best day, Tuesday, March 27, averaged just 1.5 seconds! A.C. Moore’s best time was even faster: Wednesday, April 4th at 10pm saw a load time of just 1.2 seconds. Even more amazing was the fact that ACMoore.com’s worst day, Thursday, March 29, saw an average load time of 1.8 seconds! Their worst time, however, was significantly longer (in comparison) at 3.8 seconds on Thursday, April 5 at 3pm. (It’s interesting that both slower speeds were on a Thursday.) It was really a rarity that ACMoore.com went over 2 seconds in load time, and for that, we have to applaud their excellent web performance. (ACMoore.com 10/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
California continues to reign supreme as the leading location in speed. Michaels.com loaded within 2 seconds (on average) in California, with Florida seeing the second fastest speed of 2.5 seconds. Missouri turned out to have the slowest load time of 7.1 seconds, while Utah came in second-to-last at 4.9 seconds. (Michaels.com 8.5/10)
For ACMoore.com, California is the fastest, once again, at an average of just 1.9 seconds. The second fastest, again, is Florida with 2.4 seconds. The slowest speed time is also seen in Missouri at an average of 8.2 seconds, with NJ coming in second-to-last at 5.5 seconds. It’s interesting to note that ACMoore.com proved to have faster speeds than Michaels, but also slower speeds (when it comes to loading in specific locations). (ACMoore.com 8.5/10)
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find some paint brushes, add them to the shopping cart and start the checkout process.
For each of these processes, we started by opening a new tab in Mozilla Firefox and typing in the site’s URL.
From the point of typing www.michaels.com into our Firefox browser and searching “paintbrushes” in the product search box, it took 30 seconds and 4 clicks to select a pack of brushes, add them to the cart and view the cart. It was definitely a smooth experience.
ACMoore.com was, unfortunately, a far more frustrating experience. Upon visiting the site, we were hit with a pop-up asking us to sign up for their email list to get a coupon. Plus, their signup box sits at the top of the page where a site search would typically go, so it’s easy to mix them up (despite the “Sign Up for Offers” label next to it). It didn’t take long to discover that their site also doesn’t seem to specialize in craft materials, as a search for something as basic as “paintbrushes” returned nothing. We tried altering the wording of our search a bit but gave up after a minute and a half.
To be fair, we decided to run the usability test again with different criteria. ACMoore.com seems organized around craft project ideas, without any real discernible products you can purchase from the site (and yet it has a shopping cart), which makes the two sites quite different from each other (and gives Michaels.com an edge in sheer product availability and variety). In the end, while the brick-and-mortar stores are very similar, their online presences are not. So we ran the test again to see how fast we could get to, and briefly browse, each site’s Weekly Ad.
For Michaels.com, it took about 2 clicks and roughly 10 seconds to get to the Weekly Ad for May 6 and start clicking around. It offered two choices for ads, but we chose the basic ad for the week to browse. It was a very easy experience.
For ACMoore.com, it took 20 seconds, 3 clicks and typing in our zip code to get to our local A.C. Moore store’s ad before we could start clicking around. The ad isn’t nearly as thorough or as nice as Michaels’, either.
All things considered, here are the Usability scores:
(Michaels.com 10/10)
(ACMoore.com 3/10)
When it comes to speed, ACMoore.com bested its competitor, Michaels.com, but given ACMoore.com’s lack of substance and of an actual online storefront, the comparison may not be entirely fair. Still, a quick lap through the aisles of each brand’s brick-and-mortar stores shows just how similar the two are. So, taking everything into consideration, and with both sites performing very well on sheer reliability, it’s hard not to give weight to the user experience when making the final conclusion…
It may have been squashing a goomba while punching a coin out of a brick, dodging barrels being thrown by a grumpy gorilla, sorting oddly shaped falling blocks into interlocking patterns or simply catapulting miffed fowl at a group of defenseless pigs on your mobile phone, but chances are high that everyone has played a video game at one point in their life.
Poor web performance is no game any self-respecting owner of a website should play. We recently aimed our sights at the gaming industry and picked out two heavy hitters to evaluate: Xbox and Playstation. While their websites may not be the main point of interest for gamers, they’re relied upon for information, updates and even online digital game sales. Their online gaming servers may be the most important thing to keep running smoothly in gamers’ minds, but these top players in the industry will want to make sure their website stays up and always accessible.
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from February 4, 2018 to February 25, 2018. Both sites performed well—as can be expected from parent companies Microsoft (Xbox) and Sony (PlayStation)—but, as usual, one performed just slightly ahead of the other, even if not by much.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both websites experienced 100% uptime, but both sites encountered minor errors that served as a few speedbumps along the way. Still, it wasn’t enough to qualify as downtime.
Xbox.com, despite its 100% uptime, experienced around 50 “slow page” warnings and over 20 page load timeouts (where something on the page takes a bit longer to load, slowing the page’s overall performance down). Xbox.com also returned an SSL Certificate expiration notice. However, none of these qualified as significant outages, and for that we still have to give them props. (Xbox 9/10)
PlayStation fared the same with 100% uptime, and a lot better when it came to the little errors. They registered only 7 timeouts and 5 slow page loads, and for that we give them slightly higher marks. (PlayStation 9.5/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Speed is crucial to the gamer – be it game load times (who else hates waiting for spinning icons to finish so we can get past a cut scene or move on to a new map?) or server responsiveness – so a speedy game company website is key. Xbox.com experienced pretty quick load times, with its best day being February 24th with an average of 4.6 seconds. Its best response time, however, was on February 23rd at noon with 2.2 seconds. On the flipside, its worst day was February 12 with 6.7 seconds (which isn’t all that bad), but its worst hour proved to be on February 11th at 11pm with a sluggish 13.1 seconds. (Xbox 8.5/10)
Surprisingly, PlayStation turned out to be just a little bit slower, with their best day average being 6 seconds on February 22nd. Their best time by the hour was on the same day at noon with 2.3 seconds, just a hair slower than Xbox’s best time. Their worst day was February 11th with 11.7 seconds, and their worst time by the hour was also 13.1 seconds, but on February 10th at 7am. (PlayStation 8/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
California seems to win out most of the time as the fastest location for load times and for Xbox.com, it was no different. California saw load speeds of 2.1 seconds on average, with Florida coming in second at 2.2 seconds. Georgia, however, saw an average worst time of 10.3 seconds with Missouri coming in second at 9.2 seconds. (Xbox 8.5/10)
PlayStation.com actually turned in slightly more sluggish results geographically, too. Their best location was also California, but at 2.5 seconds, with Florida a close second at 2.7 seconds. PlayStation’s slowest spots were also Georgia and Missouri, at 12.6 seconds and 11.2 seconds, respectively. It’s not the worst we’ve seen, but Xbox clearly performed better. (PlayStation 7.5/10)
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add a digital download of a popular video game to the shopping cart and start the checkout process.
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.xbox.com into our Chrome browser and clicking around to find the Xbox One games, choosing the featured one (which, in this case, was Dragon Ball FighterZ), clicking “Buy Now” and getting to the account login screen, it took 1 minute and 10 seconds. From the homepage, it took 7 clicks to get to the checkout process. It had been a while since we last visited their site, so our experience was fresh, but we encountered some significant slow loading times when getting to the product page. We actually added an extra click to the count because the “Buy Now” button didn’t load properly at first (and did nothing upon its first click). Overall, we got to do what we set out to do, but the process could have gone a lot smoother.
We were hoping for a better experience from PlayStation, and we got one. From the point of typing www.playstation.com into our Chrome browser, it took 4 mouse clicks and 35 seconds to find a featured video game (in this case, Bravo Team) and get to the checkout stage (which was also an account login screen). There was some delay upon first clicking on the game title, but the site still loaded quickly and let us get to the end of the process fast.
Both sites allowed us to get the job done in a rather speedy manner, but PlayStation’s site gave us a much more positive experience.
With that said, here are the Usability scores:
(Xbox 8/10) (PlayStation 9.5/10)
Both sites performed very well, but that positive user experience helped push one over the other, albeit only slightly. So while it was a tough call to make, we have come to a conclusion —
Congratulations, you have just leveraged an awesome Software as a Service (SaaS) offering for your organization. Perhaps you have implemented a popular application – like Office 365, Salesforce or Dropbox – to support your staff and enhance collaboration between teams. Now you need to ensure that your employees and/or customers are happy too.
At this point, a common misconception often arises: the belief that a SaaS application relieves businesses of all responsibility for monitoring it. If you believe that, it is just a matter of time before your business is rudely awakened, with customers complaining about outages or poor performance on social media and overloading the support desk with calls.
A negative customer experience when utilizing one of your SaaS applications can affect your bottom line. Unfortunately, you cannot totally rely on your provider to keep the system ticking; even the big guys experience outages and cyber attacks. Synthetic monitoring provides a solution, a way for you to keep your finger on the pulse of your cloud services.
Taking responsibility for SaaS applications
Effective SaaS monitoring is measured by how positive the end-user experience is. For instance, if a user cannot log in to an application to retrieve a file you sent them, they will not be happy. Can you leave it up to a SaaS provider to keep you up to date when they have a problem? No. In fact, it is not unusual for SaaS providers to delay making press statements when they experience problems, or to not announce them at all. Organizations are fast realizing that they must take responsibility for proactively monitoring the performance of the SaaS applications they use.
In addition, in 2016 Gartner predicted that by 2018 50 percent of enterprises with more than 1,000 users would use cloud products to monitor and manage their use of SaaS and other forms of public cloud. This reflects the growing recognition that, although clouds are usually stable, monitoring applications requires explicit effort on the part of the cloud customer.
Why do you need to monitor your SaaS applications yourself?
Monitoring the customer experience (CX)
Synthetic monitoring has immense benefits for monitoring SaaS applications. It can help you keep a finger on the pulse of your SaaS application by addressing the core issues that affect the customer experience and, with it, your bottom line.
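In practice, a synthetic check is just a scripted user. Below is a deliberately tiny sketch of a synthetic login check (the /login endpoint, credentials and thresholds are hypothetical; production monitors, AlertBot’s included, replay full browser sessions rather than raw HTTP calls):

```python
import time
import requests

def alert(message: str) -> None:
    # Stand-in for a real notification channel (email, SMS, phone call).
    print("ALERT:", message)

def check_login(base_url: str, user: str, password: str) -> None:
    start = time.monotonic()
    resp = requests.post(f"{base_url}/login",  # hypothetical endpoint
                         data={"user": user, "password": password},
                         timeout=10)
    elapsed = time.monotonic() - start
    if resp.status_code != 200:
        alert(f"Login check failed: HTTP {resp.status_code}")
    elif elapsed > 3.0:
        alert(f"Login succeeded but took {elapsed:.1f}s")

check_login("https://saas.example.com", "monitor-bot", "s3cret")
```

Run on a schedule from several locations, a script like this surfaces outages and slowdowns before your users start tweeting about them.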
5 top advantages of using synthetic monitoring for SaaS applications
Synthetic monitoring for SaaS is growing in leaps and bounds
According to a MarketsandMarkets.com report, “Synthetic Monitoring Market by Monitoring Type” (paywall), the enterprise synthetic application monitoring market is expected to grow from $919.2 million in 2016 to $2,109.7 million by 2021, at a CAGR of 18.1 percent. The report predicts that “SaaS application monitoring is expected to gain maximum traction during the forecast period.” Don’t get left behind.
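As a quick sanity check on the report’s math (the check is ours, not theirs), compounding the 2016 figure at the stated CAGR does land on the 2021 projection:

```python
# $919.2M growing at 18.1% per year over the five years from 2016 to 2021.
base, cagr, years = 919.2, 0.181, 5
projected = base * (1 + cagr) ** years
print(f"${projected:,.1f} million")  # about $2,111.8 million, matching the report
```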
Conclusion
A 451 Research study found that the rapid growth of public cloud services and network virtualization has often outstripped management and monitoring capabilities, creating “blind spots” in network operations’ ability to maintain internal uptime and performance benchmarks. If you only recently climbed on the SaaS bandwagon, it is likely that your existing system monitoring tools are not cloud-friendly.
You may need some help from the experts to keep your finger on the pulse of your new SaaS application. Mosey along to AlertBot for more information about a holistic synthetic monitoring solution.
Whether you’re hitting the gym or the trails, you’re likely to be lacing up with some active footwear that helps you burn calories and exercise in comfort and style. When it comes to activewear, many companies these days contribute accessories and gear to our daily workout regimens; however, two major players come to the front of our minds when it comes to popular footwear brands.
For our latest AlertBot Showdown, we picked frontrunners Nike and Reebok to evaluate the web performance of each athletic brand’s online persona.
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from October 1, 2017 to October 22, 2017. While both sporty sites performed well, it became pretty clear after a significant trip-up that one site left the other in the dust.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
For the first time in our experience of tracking sites for a Showdown, one of the sites in the running went down while we were actually in the office. That gave us the ability to watch the event as it unfolded while AlertBot performed its tests against the failing site. Reebok.com hit a snag on October 13 around 3:30pm EST. It took nearly a full hour for their site to recover. We manually checked their site from our desks at 4pm, and the site was still down. We checked again at 4:15 and the site was back up; however, only text was loading – no images. By 4:30pm, when we checked one more time, Reebok.com was back up in its entirety. It was the only failure event that Reebok.com encountered during the weeks it was tested for this Showdown, but it was definitely a doozy. During this time period, their average uptime was 99.85% – proof that “99% uptime” can still contain an hour of critical downtime. And for a retail site, this could truly prove costly. (Reebok 7/10)
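A little back-of-the-envelope arithmetic (ours) shows why a fraction of a percent matters:

```python
# Roughly three weeks of monitoring (October 1-22), minute by minute.
window_minutes = 21 * 24 * 60                 # 30,240 minutes observed
uptime = 0.9985                               # Reebok.com's measured average
downtime = window_minutes * (1 - uptime)
print(f"{downtime:.0f} minutes of downtime")  # ~45 minutes: the near-hour outage
```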
On the other hand, Nike.com experienced no significant failure events and only occasionally experienced minor issues like a slow page file or a “timed out” error. From the starting line, Nike is already on the fast track to success between the two brands. (Nike 9.5/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Speed is everything for the image of brands like these, which makes it a bit ironic that both sites seem to struggle a little in this area. Reebok’s fastest average speed was on October 4th with 6.4 seconds load time. Their worst average speed was October 23 at 7.9 seconds. They’re not drastically different, but that’s not an impressive load time. (Reebok 7/10)
At this point, one might expect Nike to sprint past Reebok in the load time category, but Nike didn’t fare much better, with 6.3 seconds being their fastest average speed on October 23 (which is coincidentally the day of Reebok’s slowest average), and Nike’s slowest average speed was 7.5 seconds. Again, they’re not great speeds, but in this case, Nike edges out Reebok, even if it is only by a slight skip rather than a jump. (Nike 7/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
Looking at site response time geographically tells a different story. First off, Reebok shows that they had the fastest load time in Texas with an average of 3.7 seconds. Their second fastest time was in New Jersey at 4.8 seconds. Virginia produced the slowest return, with an average of 6.9 seconds. (Reebok 7.5/10)
Yet again, Nike only performed slightly better, with California showing the fastest average speed of 3.2 seconds and Texas showing the second fastest at 4.5 seconds. However, Nike performed worse than Reebok when it came to slowest location, with Illinois taking the cake for worst average speed, at 9.7 seconds! (Nike 7/10)
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater, or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add their latest running shoe to the shopping cart and start the checkout process.
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.reebok.com into our Chrome browser and clicking around to find a Men’s Running Shoe, choosing the first one, choosing a size, adding it to the cart and clicking “checkout,” it took 36 seconds. From the homepage, it took 5 clicks to get to the checkout process. At first glance at the homepage of the site, it seemed like it might be a challenge to actually find what we’re looking for, but it was a pretty easy shopping experience.
From the point of typing www.nike.com into our Chrome browser, it took 8 mouse clicks and 48 seconds to find a men’s running shoe and get to the checkout stage. Upon first visiting the site, the visitor is hit with an ultra-closeup of a bunch of kids in gray Nike hoodies that takes most of the page hostage. We scrolled down to the first running shoe advertised and clicked on it, only to find that it was a women’s shoe (which is not mentioned on the homepage image). We then had to click around to the men’s department in order to find a shoe and continue the process. Both sites get the job done, but Reebok was a more pleasant shopping experience.
With that said, here are the Usability scores:
(Reebok 9/10) (Nike 8/10)
Both sites performed respectably, but we can’t ignore the failure that Reebok experienced on the 13th. Other than that, the sites performed quite similarly (and we actually preferred Reebok’s shopping experience a little more than Nike’s). Still, since we’re really weighing web performance here, the winner is rather clear —
It’s that time of year again, when sales-conscious bargain chasers brave the throngs of fellow sale hunters in the frigid November early morning air on that most dreaded of retail shopping days: BLACK FRIDAY. Just hours earlier, many of these same credit-card-wielding warriors were huddled around a table with family, giving thanks once again while stuffing themselves to their waistline’s discontent with mashed potatoes, roasted turkey and homemade pie. The juxtaposition of these two contradictory practices is staggering, but it’s no less the holiday tradition year after year.
As we approach another Christmas holiday, the world of ecommerce continues to ramp up its approach to Black Friday (and its younger electronic sibling, Cyber Monday), with many retailers now starting their sales right after Halloween. Accordingly, we decided to do something special for our next Website Showdown: a Black Friday / Cyber Monday edition that pits ecommerce colossus Amazon against the websites for brick-and-mortar retail mega-stores Walmart and Target. It’s a truly epic battle royale to see how each site performs during the biggest shopping days of the year.
So, as usual, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor all three sites from Thanksgiving through Black Friday and Cyber Monday, spanning from November 23, 2017 to November 27, 2017. We expected strong, reliable performance during the entire run and we were not disappointed. The results were nothing short of impressive.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Usually for this section, we evaluate each site’s performance in detail, drilling down to the specific errors each one faced and the types we commonly see (like Slow Page Files, Timeouts, etc). It’s unusual for a two-site Showdown to come back without a single error, much less a three-site one. Yet in this special evaluation of three sites, not one solitary error was found among them. All three sites avoided any kind of failure event or significant error. With the stakes so high for three of the biggest retailers on the most significant sale days of the year, one would expect nothing less. So, with that said, each site earns a perfect score for Reliability.
(Amazon 10/10)
(Walmart 10/10)
(Target 10/10)
Speed
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Sites like Amazon, Walmart and Target boast very graphics-driven designs, and especially with monstrous sale event days like these, the graphics are often big, bold, and frequently changing.
With that said, over Amazon.com’s 5-day run, the fastest day on average was Sunday, November 26th, at 4.3 seconds. It’s not the slickest speed a site can have, but it’s certainly not bad. On its slowest day, on average, Amazon still clocked in at 5 seconds on Cyber Monday, which is still not too shabby. Looking at specific times of day, the best hour was 5AM on Sunday with an impressive 3.4 seconds, while Cyber Monday also saw the slowest hour, at 7AM with 6.7 seconds.
(Amazon 9/10)
Walmart.com held their own surprisingly well during this time, too. Their best average day was Thanksgiving Day, November 23rd at 4.2 seconds, just barely edging ahead of Amazon. Their worst day on average was Saturday, November 25th, also at 5 seconds. Finally, their best hour on average was on Thanksgiving at a remarkable 2.7 seconds at 5PM. Their worst time on average was 6.4 seconds at 2AM on Sunday, November 26.
(Walmart 9.5/10)
Last, but certainly not least, Target.com didn’t perform quite as well as the other two, but it still performed respectably, especially considering its site avoided any failure events. Their best day for speed, on average, was Thanksgiving Day at 5.2 seconds, which is worse than both Amazon’s and Walmart’s worst days. Target’s slowest day on average was Sunday, November 26 at 5.4 seconds, which, at the very least, shows great consistency in the performance of the retail chain’s online presence. Their fastest hour turned out to be on Black Friday at 9AM with 3.9 seconds, with their slowest on Cyber Monday at 4AM with 7.6 seconds.
(Target 8.5/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
California tends to see the fastest web transaction speeds in the country, and in this test scenario, it once again comes out on top for each website. For Amazon.com, the titan of ecommerce saw average load times of 2 seconds in The Golden State, with their next-fastest location being Texas at 3.2 seconds. When it came to their slowest locations, Illinois came in at the bottom with 6.6 seconds, with Georgia just above them at 6.3 seconds.
(Amazon 9/10)
Walmart.com was only a tenth of a second faster, seeing an average load time of 1.9 seconds in California, and also coming in faster in Texas at 2.7 seconds. But Walmart saw a placement swap for slowest state, with Georgia coming in at the bottom at 6.6 seconds and Illinois right above them at 6.5 seconds.
(Walmart 9.5/10)
Target loaded on average at 2.7 seconds in California, with Texas coming in next at 3.5 seconds. Again, Target’s fastest speeds proved to be slower than their competitors. The slowest average speed that Target saw in the U.S. was also Georgia, at 7.2 seconds, but Washington stepped in as their second slowest, at 7 seconds flat.
(Target 8.5/10)
For usability, we always select a common task a user might typically try to accomplish when visiting the sites we’re testing and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater. Like with the most recent Showdown for Lowes and Home Depot, we decided to see what the experience would be like to use these three different websites to add a common product to the shopping cart.
For each of these processes, let’s see about adding the PS4 version of new video game Star Wars: Battlefront II to our shopping cart. To begin each process, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.amazon.com into our Chrome browser, typing “Star Wars Battlefront 2” into the store’s search box and adding it to the cart, it took 30 seconds. From the front page, it took about 5 clicks (including selecting the autocomplete suggestion in the search bar) to get to the final “Place Order” window.
From the point of typing www.walmart.com into our Chrome browser, it took about 4 clicks and 35 seconds to get to the Cart Checkout window. The autocomplete was a little clumsy to deal with (it was tough to tell if the browser was really proceeding to load the site), but overall, it was a decent experience.
From the point of typing www.target.com into our Chrome browser, it took about 5 clicks and 27 seconds to get to the Cart Checkout window.
All three sites were good experiences, although each one has a very different feel. It’s a tough call to say which user experience we found better, so we decided to try a second test. This time, we chose something different: Wonder Woman on Blu-ray. We also decided to use Mozilla Firefox this round.
The process of finding the Blu-Ray disc and getting to the checkout process on Amazon took about 4 clicks and 25 seconds. The process on Walmart.com took 26 seconds and 5 clicks. On Target.com, it took roughly 24 seconds and 4 clicks. This time, we noticed that in the search results, there’s a convenient “Add to cart” option next to the items on Target’s site that Walmart and Amazon both DON’T have. This definitely gives Target a slight edge over their competitors. And with that being the only real significant difference, outside of its slightly faster completion time, we’ll have to say Target wins the Usability portion of this Showdown.
(Amazon 9.5/10)
(Walmart 9.5/10)
(Target 10/10)
With stakes this high, you would only expect the best from the leaders in the retail industry, so it comes as no surprise that the results were so good and so close. This may be the toughest Showdown we’ve had to score yet, especially with three hats in the ring this time around.
But, with all things accounted for – reliability, speed, geographical performance, and the site’s usability – we’ve reached our verdict:
10 Ways to Optimize Images to Improve Your Website Performance
by Louis Kingston
“Visuals express ideas in a snackable manner.” – Kim Garst, CEO of Boom Social
Visual imagery on websites is a powerful tool to grab users’ attention, keeping them curious, engaged and interacting with your webpage. Humans are a visual species: our brains can process an image in as little as 13 milliseconds, and over half of the brain is devoted to processing the visual information it receives. Our memory for pictures is also far better than our memory for text, and over 65% of the population are visual learners. What this means is that our websites must contain a healthy dose of visual images to keep a visitor engaged. Whether it’s on our homepage, our service pages, in our blog articles or on our e-commerce sites, images are essential to driving sales, conversions and, ultimately, company growth.
Are Images Slowing Down Your Load Speed?
However, the images used must be optimized so that they don’t hamper your website’s performance. If they are too large, they are going to slow down your website’s loading speed, and the Google algorithm doesn’t like that. Take more than seven seconds to load and Google is going to ignore you, and you won’t make it to page one of the SERPs (search engine results pages). The search engine’s focus is on organically surfacing businesses that offer a great user experience; slow load speed will just have potential visitors clicking away.
Google loves text, and when it crawls your site, it can’t ‘read’ your images unless you have created file names, alt tags, and captions to describe the image. You are losing out on a perfect SEO opportunity if you don’t optimize your images.
Let’s investigate ten ways you can achieve image optimization for your website…
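One of the most common pointers, resizing and recompressing oversized images, can be sketched in a few lines (a minimal example assuming the Pillow imaging library, installed via pip install Pillow; the filenames and thresholds are illustrative):

```python
from PIL import Image  # Pillow

def optimize(src: str, dest: str, max_width: int = 1200, quality: int = 80) -> None:
    img = Image.open(src)
    if img.width > max_width:
        # Cap the width while preserving the aspect ratio.
        ratio = max_width / img.width
        img = img.resize((max_width, round(img.height * ratio)))
    # A progressive JPEG around quality 80 is usually visually
    # indistinguishable from the original at a fraction of the bytes.
    img.convert("RGB").save(dest, "JPEG", quality=quality,
                            optimize=True, progressive=True)

optimize("hero-banner.png", "hero-banner.jpg")  # illustrative filenames
```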
How is Your Website Performing at the Moment?
Of course, these are just ten basic image optimization pointers. You can drill down even further on image optimization to enhance your website performance. If you would like to find out more about your website’s performance, AlertBot can show you what elements are slowing down your site or what bottlenecks are causing user traffic to click away. We also offer a Free 14-day trial (without collecting any billing info). Give us a try!
Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.
Website Monitoring Leader AlertBot Adds Mac Support for Web Recorder & Enhances SSL Testing Functionality
AlertBot’s multi-step web recorder, which has been available to Windows users for several years and now supports Mac users, is a fast, easy and reliable way to verify that all interactions on a website are working properly.
ALLENTOWN, PA (October 25, 2017) – AlertBot announced today that, per a new update, it has added Mac support to its acclaimed multi-step web recorder and has made several other security and usability improvements.
AlertBot’s multi-step web recorder is a fast, easy and reliable way to verify that all interactions on a website are working properly. Customers simply click record, interact with their website as desired (e.g. perform a search, put items in a cart, and so on), and upload their finished script to AlertBot, which then automatically performs these pre-set actions at regularly scheduled intervals. Any variations or concerns are immediately sent to customers for investigation and resolution.
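For a sense of what a replayed script verifies, here is a hand-written Selenium sketch of an equivalent multi-step check (the store URL and element selectors are hypothetical; AlertBot customers record these steps point-and-click rather than coding them):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    driver.get("https://store.example.com")                      # step 0: load
    driver.find_element(By.NAME, "q").send_keys("gloves\n")      # step 1: search
    driver.find_element(By.CSS_SELECTOR, ".product a").click()   # step 2: open item
    driver.find_element(By.ID, "add-to-cart").click()            # step 3: add to cart
    # A missing element raises NoSuchElementException -- the "send an alert" case.
    assert "cart" in driver.current_url.lower()
finally:
    driver.quit()
```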
Customers can also re-record their script at any time through AlertBot’s desktop dashboard, or through the re-designed viewer for smartphone and tablets, which per the update is now faster and easier to use.
“We are excited to bring our multi-step web recorder to our Mac customers, which allows them to change their multi-step testing scripts more easily,” commented Pedro Pequeno, President of InfoGenius.com, Inc. which owns and operates AlertBot. “Mac users are an important and valued part of our user base, and we want to make sure they continue to have the best tools available.”
Also featured in the update are new advanced SSL error-ignoring and TLS features, which give customers greater control over site diagnostics and help them meet PCI compliance standards. For example, customers can now choose how to handle SSL certificate expiration dates, domain mismatches, and other common certificate issues, as well as specify which Transport Layer Security (TLS) versions to allow.
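As a rough stand-in for one of the certificate checks described above, the Python standard library is enough to read a site’s certificate expiration date:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> int:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like: 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

print(days_until_expiry("www.alertbot.com"), "days until the certificate expires")
```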
Other key usability improvements include:
Added Mr. Pequeno: “With the surge in data breaches, PCI compliance standards are more important than ever. AlertBot’s enhanced monitoring capabilities help our customers ensure that the SSL aspects of this compliance commitment are always being met.”
About AlertBot
Founded in 2006, AlertBot enables businesses, through its industry-leading TrueBrowser® solution, to continuously monitor the availability and performance of their mission-critical public Internet services from across the country and around the world. When AlertBot detects an issue with websites or servers, it analyzes the problem within seconds from multiple geographic locations and delivers real-time alerts to business leaders and system administrators on their smartphones and other mobile devices. Thousands of companies trust AlertBot to help them deliver the uptime and performance they expect, and their customers demand. Learn more at http://www.AlertBot.com.
About InfoGenius.com, Inc.
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Learn more at http://www.infogenius.com.
Living in an age where nearly every industry is driven by ecommerce, it should come as no surprise that this includes the home improvement world. Home Depot and Lowes are titans in their industry, and both have a strong online presence. But when it comes to who may have the better performing site, we set out to nail down one true winner.
For our fifth website Showdown, the AlertBot team got out their proverbial measuring tape and slipped on a stylish apron to dig into the performance of HomeDepot.com vs Lowes.com.
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from August 11, 2017 to August 31, 2017. Not surprisingly, the performance for these heavy lifters proved rather resilient on both sites. Neither site experienced significant downtime, but as usual, one did prove to perform a little better than the other.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
HomeDepot.com performed quite well over the tested time period, experiencing no failure events. At most, it had a couple hiccups, like a short-lived Timed Out error or a Slow Page File notice, but none of these occurrences caused any amount of significant downtime. (HomeDepot 9/10)
On the other hand, Lowes’ site experienced one failure event on August 21st, when the site stopped responding for roughly three minutes around 12:21 in the afternoon. When errors like these occur, AlertBot tests them from a second location to confirm whether the error is widespread or just a brief localized outage. In this instance, the error persisted across tests from different locations, qualifying it as actual site downtime before the issue resolved. (Lowes 8/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
HomeDepot.com has a great deal of graphics on the front page, which typically slows sites down considerably. However, it didn’t seem to slow this site down much. HomeDepot.com’s best day, on average, was Tuesday, August 29th with an impressive load time of 1.1 seconds. The “worst” day average was still an impressive 1.9 seconds. When evaluating the site’s speed by hour, the site loaded in just 0.8 seconds at 1AM on Sunday August 20th. The worst hour was also on August 20th, at 2PM with 5.1 seconds. Overall, HomeDepot.com’s speed is quite good. (HomeDepot 9.5/10)
Lowes.com has drastically less content on its front page, but it performed considerably slower than HomeDepot.com. Sadly, Lowes’ best day was actually slower than HomeDepot’s worst, with an average of 6 seconds on Sunday, August 13th. Lowes.com’s worst day was Monday, August 26th with 7.1 seconds. That’s not horrendous, but with sites expected to perform faster and faster these days, a respected retail giant like Lowes needs to up their speed game. On an hourly average basis, their best time was 11PM on Wednesday, August 23rd with 7.1 seconds (again, their fastest time is slower than HomeDepot’s slowest). Their worst load time by hour was Sunday, August 27th at 1PM with a sluggish 10.1 seconds. (Lowes 8/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
Usually when we look at site speeds across the United States, sites tend to perform better in California than anywhere else. That isn’t the case for HomeDepot.com, however. For Home Depot, Florida experienced the fastest web transaction (less than one second), while California saw the slowest (but still only 2.3 seconds). After Florida, the next fastest web transactions were in New Jersey and North Carolina (both at 1 second). (HomeDepot 9/10)
Lowes.com had the fastest web transaction in California at 3 seconds. The next fastest was North Carolina, already up to 4.3 seconds. The slowest performance occurred in New York at a whopping 9.4 seconds (with the second-slowest being Georgia with 9.3 seconds). (Lowes 7.5/10)
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater. For this Showdown, we’ll see what the experience is like to use their respective websites to add a common product to the shopping cart.
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.homedepot.com into our Chrome browser, entering “leather gloves” into the search box, choosing a pair and adding it to the cart, it took 25 seconds. From the front page, it took 5 clicks to reach the “Checkout now” process. That wasn’t bad, but we found the Lowes process just a bit smoother.
From the point of typing www.lowes.com into our Chrome browser, it took 4 mouse clicks and 20 seconds to get the gloves into the shopping cart and view the cart. The “Add to cart” button is much more obvious and visible on Lowes’ site, whereas it took a moment to locate on Home Depot’s. And while both sites offer a “compare” option so you can look at product features side by side, it wasn’t very noticeable on Home Depot’s site, while it was more prominent on Lowes.com.
The aesthetics of both websites aren’t bad, but Lowes has a crisper, more streamlined appearance and functionality. Both sites get the job done pretty quickly, but we had a slightly smoother experience with Lowes. With that said, here are the Usability scores:
(HomeDepot 9/10) (Lowes 10/10)
Both sites performed respectably, but HomeDepot.com was clearly faster and more reliable than Lowes.com. Although we preferred the shopping experience on Lowes.com just a little bit more, we cannot ignore the slower site performance.
So, for the fifth AlertBot Showdown, the site that gets to join the ranks of previous winners Apple, FedEx, and Burger King is…
Whether you’re picking up a Kids meal for your littlest picky eater or satisfying a hankering for greasy and salty French fries, chances are you’ve found yourself in line at a drive-thru for McDonald’s or Burger King at some point in your life. But these two massive burger chains also have an online presence, and while you’re not exactly going to try to order a single or double patty to be shipped to your home, you might find yourself visiting the websites for either fast food giant to look up their menus or latest promotions.
So for this, our fourth website Showdown, the AlertBot team rolled up their sleeves, grabbed a handful of ketchup packets, and sat down to take the wax paper wrap off of these two websites to see just how the sites for BK and Mickey D’s performed in comparison to one another.
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for three weeks, spanning from June 5, 2017 to June 26, 2017. Not surprisingly, the performance proved to be reliable for both sites. Neither chain’s site went down, but as usual, one did prove to perform a little faster than the other.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
Both sites performed quite well during the test period, but McDonald’s site experienced a hiccup on the first day of the test, June 5. It was a timed-out warning (meaning the site failed to load within the expected time period), but it lasted only a couple of minutes and didn’t seem to affect the site for long. Otherwise, the site was pretty stable. (McDonald’s 9/10)
On the other hand, Burger King’s site didn’t experience any confirmed failure events at all, achieving complete uptime during the test period. However, it did see two transient errors (one a slow page notice and one a brief timed-out notice, each lasting under a minute) that affected the site’s performance from a single location. When errors like these occur, AlertBot tests them from a second location to confirm whether the error is widespread or just a brief localized blip. In these instances, the error occurred from just one test location and didn’t qualify as a downtime event. (Burger King 9.5/10)
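The “confirm before alerting” logic described above is easy to approximate in a few lines. The sketch below simply repeats a plain HTTP check before declaring an outage; a real multi-location monitor would re-test from different geographic nodes, and the URL is a placeholder.

```python
# A rough sketch of confirming an outage before alerting, so transient,
# localized blips don't page anyone. Illustrative only; a distributed
# monitor would re-test from *different* locations, not the same machine.
# Requires: pip install requests
import time
import requests

def is_down(url, timeout=10):
    """True if the URL errors out or answers with a 4xx/5xx status."""
    try:
        return requests.get(url, timeout=timeout).status_code >= 400
    except requests.RequestException:
        return True

def confirmed_outage(url, checks=3, delay=30):
    """Only report downtime if every repeated check fails."""
    for i in range(checks):
        if not is_down(url):
            return False       # the site answered; just a blip
        if i < checks - 1:
            time.sleep(delay)  # wait before re-testing
    return True

if confirmed_outage("https://www.example.com"):  # placeholder URL
    print("Outage confirmed by repeated checks; alert the on-call team.")
else:
    print("Transient or localized blip; no alert raised.")
```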
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Both sites are quite graphics-heavy, so it doesn’t surprise us that they may experience some slowness at times.
McDonald’s loading speeds averaged around 9.5 seconds per day, with its best time being 5 seconds at 10 AM on Monday, June 12, and its best day being Monday, June 26th, with an average of 8.8 seconds. Its worst day was Monday, June 5th, when the load time crawled to an average of 12.7 seconds, and its worst time was at 11 PM on Wednesday, June 7th, with a pitiful 17.6 seconds. (McDonald’s 8.5/10)
Burger King performed significantly better by comparison. Overall, the site averaged 3.6 seconds for its load time, which is pretty good. Its best day was June 19th, when it averaged 3.5 seconds, with its best load time being a speedy 1.8 seconds at 6 AM on Wednesday, June 14th. Monday, June 5 was its worst day, seeing a 6.1-second load time (which was still better than McDonald’s BEST day), and its worst time was 8.5 seconds at 10 AM on Saturday, June 17th. (Burger King 9.5/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
It seems to be the norm for California to record the fastest speeds, and the same held true for McDonald’s. However, surprisingly, New Jersey was the next fastest state on the list. Comparatively, the fast food legend saw its slowest load times in Georgia and Utah. (McDonald’s 9/10)
Burger King, for the most part, saw stronger returns across the board, with California, Colorado, Virginia, Missouri, Washington and Texas all pinging at approximately 1 msec. Its slowest locations were North Carolina and, again, Utah. (Burger King 10/10)
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we tested out how the experience of tracking a real package might look when using two popular parcel services. For this Showdown, we’ll see what the experience is like to use their respective websites to look up the menu and nutritional information on each company’s signature burgers.
For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.
From the point of typing www.mcdonalds.com into our Chrome browser and navigating until we found the Big Mac nutritional info, it took 26 seconds. We were held up at first by a prompt on the front page asking us to join their email list, and the browser also wanted to access our location. From closing out the pop-up down to finding the Big Mac info, it took five mouse clicks.
Now, from the point of typing www.burgerking.com into our Chrome browser, it took four mouse clicks and 18 seconds to get to the Whopper’s nutritional info. BK’s design is much simpler, so we can see why its load times were faster.
We liked the aesthetic of both websites, but McDonald’s has a slightly more modern feel to its design. However, its graphics are larger all-around and there is more going on on the page, which could be why its overall load times are slower than Burger King’s.
So, with all things considered, with the goal being able to find the nutritional info on each chain’s most popular burger, here are the Usability scores:
(McDonald’s 9/10) (Burger King 10/10)
Neither site dramatically outperformed the other, but it’s safe to say that Burger King edges out McDonald’s in speed and overall performance. (Just for fun, we should follow this up with a who-has-the-better-French-fries competition!)
So, for the fourth AlertBot Showdown, the site that gets to join the ranks of previous winners Apple, FedEx and Fandango is…
Tortoise, Dinosaur or Ostrich?
Proactive vs Reactive Web Monitoring – 3 Metaphors From the Animal Kingdom
by Penny Hoelscher
In February 2017, Amazon Web Services’ (AWS) S3 web-based storage service suffered an outage that led to half of the internet “melting down” and cost businesses millions. It was caused by an operator’s typing error while issuing a routine command to take a few S3 servers offline.
What has this got to do with you?
Although the entire outage lasted only 4 hours and 17 minutes, Amazon came under attack from experts and customers in toe-curling global headlines. AppleInsider reported that even Apple was affected, with a variety of cloud services experiencing outages and slowdowns; Apple relies on Amazon for portions of its cloud infrastructure. Albeit not as a result of the meltdown, the company is rumored to be gradually shifting away from its dependence on Amazon.
Perhaps you’re not an Amazon or an Apple, but you too may be vulnerable. It all boils down to reliability, which has a direct effect on your revenue stream. If your web application or site delivers poor performance, your customers will go to your faster, more modern, more customer-centric competitors, where they experience less downtime, fewer outages, faster page loading times and better service. The result: you lose sales, money and even your reputation.
How can you tell that it’s time to upgrade your website monitoring tool and get expert assistance? Well, if you only notice a decline in visitors once the waterfall of traffic to your website has slowed to a trickle, you’ve already dropped the ball. An external website monitoring tool like AlertBot can alert you to potential signs of trouble much earlier.
In a nutshell, if your website persona resembles one of the following – tortoise, dinosaur or ostrich – you’re in trouble:
TORTOISE – Outages, high downtime and slow loading times
The internet is not like your local shopping mall, a convenient one-stop shop for all your household needs. These days, “I want it and I want it now” customers have far more options, and if you’re closed for business, they’re not going to have a cup of coffee and wait for your door to open again; they’re simply going to mosey over to your competitors. Some old adages still hold true in the digital sphere, but customer loyalty is a fair-weather friend in an online environment, and for customers and affiliates alike, time is money.
Website monitoring tools not only report on outages and high downtime, they help you identify where (e.g. a particular geographic location), when (e.g. peak hours) and why (e.g. network issues) these occur. You may find it is your business model that is at fault, not slow servers or bloated software; for instance, perhaps you’re performing maintenance and upgrades at the wrong time for a time zone different from your head office’s.
In addition, page loading speed is one of the factors Google uses to rank your web pages. This matters because when customers search for products and services, they click on the businesses Google serves first.
DINOSAUR – Being behind the times
Google lowers mobile page rankings for companies that do not have a mobile-responsive web design. New website design trends have changed the face of online businesses, and today’s tech-savvy generation can spot an old-fashioned, uncool design in a heartbeat. But keeping up with new design technologies can have an impact on your website’s performance. Page bloat is much like a beer belly: extraneous code, affiliate advertising and toxic data (storage of unnecessary and dated information) creep up sneakily but have a huge impact.
One of the main benefits of a professional website monitoring service is that it provides automated intelligence that can manage big data and learn from the information it receives. You don’t have to wait for users to complain or continuously test the site yourself, and, because your business is constantly evolving, the service can update its checks in tandem. These sophisticated technologies not only gather and analyze the data you need to make informed decisions about performance, they also point you toward solutions.
Cyber attacks are a 21st-century bane to which all online businesses – big and small – are vulnerable. Of increasing concern is that at many companies it can take months before a data breach is detected, giving cyber criminals plenty of time to ravage their victims’ systems. AlertBot can’t prevent a data breach, but it can alert you when you’re attacked, e.g. by notifying you that files have changed or your site has inexplicably gone down.
OSTRICH – Customer complaints
Negative social media posts can be harsh on a business’s reputation, and it can often seem unfair, especially when the trolls join the battle to bring you down. Sure, you need a team to monitor social media channels and publicly appease customers (including the trolls) who have issues, but that’s not enough. An external website monitoring service can give you advance warning of problems with your system.
Customer Experience (CX) is not just about the latest trends – mobile first, conversational brands, emotional engagement, predictive analytics, personalization and so on. CX is about serving customer needs and wants (read: demands) BEFORE they start complaining. Once your website starts exhibiting dinosaur or tortoise characteristics because you’ve been acting like an ostrich with its head in the sand, it is too late; all you will have left are your ex-customers’ public vents still floating around on complaints forums and social media channels.
Conclusion
The Amazon debacle should be a wake-up call for businesses to be more proactive about monitoring the uptime and infrastructure of their systems. Imagine how red your company’s face would be if you didn’t notice a crisis before your users did and had to be informed by their irate calls and emails.
A monitoring tool like AlertBot simulates actual user behaviors and interactions, and runs tests in real time using popular web browsers like Chrome and Firefox. It’s easy to set up (no installation necessary) and allows you to create scripts for different user experiences across multiple devices, using multiple features and functions, enabling you to be proactive at the best of times and promptly reactive at the worst (after all, accidents do happen).
One of the most appealing things about ordering items online is receiving packages in the mail. Not only is it convenient for the fruits of your shopping toils to be brought directly to your door, but you can do your shopping from anywhere at any time of the day or night (and in your pajamas if you so desire). Two well-known, worldwide services that nearly everyone who has sent or received a parcel has used are UPS and FedEx. Both services are easily accessible for sending packages, and both are frequently used for receiving them. Both services also have websites that enable users to track their packages (if they’ve been given a tracking number), while also helping to provide resources for sending them out.
For our third Showdown, we set out to track the performance of these two services, trucking along until we could wrap up the results for delivery to you.
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both parcel service sites for three weeks, spanning from March 27, 2017 to April 17, 2017. Not surprisingly, the performance proved to be reliable for both sites. Neither service’s site went down, but one did prove to perform a little faster than the other.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
FedEx’s website experienced not a single, solitary failure event. At the very worst, it may have experienced some slight slowness for a short period of time, but it didn’t affect their overall reliability results. (FedEx 10/10)
UPS’s website was a slightly different story, though there were no failure events or periods of actual downtime either. The most UPS’s site saw was a handful of warnings that it was performing a little slower than usual, and a little slower than the average expected load time. These periods of minor slowness lasted only about 3 to 5 minutes each. (UPS 9.5/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Both websites have pretty basic homepages, so the load times for customers should be fairly quick (even on a slow internet connection) if the sites aren’t experiencing any server issues.
FedEx’s site speed is fantastic, averaging less than 1 second on most occasions. Its fastest response time was recorded on Wednesday, April 5, 2017 at 0.5 seconds, while its slowest was on Monday, April 17, 2017 at just over 2 seconds (which is still very good). (FedEx 10/10)
UPS was also pretty good, but its best response time was about the same as FedEx’s worst. UPS’s best response time was 2 seconds on Tuesday, April 11, while its worst was just a hair under 6 seconds on Monday, April 10th. The standard used to be 7 seconds, but these days users expect sites to load in roughly 2 seconds. (UPS 8/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.
It’s also interesting to note that in most of the tests we’ve done for these Showdowns so far, California frequently comes out on top when it comes to website speed. With that said, FedEx performed best in California, at just under half a second, and “worst” in Virginia, which still averaged an impressive 1.1 seconds. (FedEx 10/10)
UPS also saw its best results in California, but clocked in at around 1.4 seconds there. Texas returned the slowest results, however, averaging around 5.6 seconds. (UPS 8/10)
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we went through the motions of ordering tickets for a recent movie on MovieTickets.com and Fandango.com. For this evaluation of FedEx and UPS, we’ll see how the experience of tracking a real package goes.
For each package tracking process, we started with the tracking number copied to our clipboard and then typed the URL of the test site into our browser.
From the point of typing www.FedEx.com into our Firefox browser, selecting the tracking tab at the top, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took only 15 seconds to get to the tracking results. That’s really fast! We then tried the same process again using the Google Chrome browser, where the “region” needed to be selected first; this time it took only a second longer to complete!
Now, from the point of typing www.UPS.com into our Firefox browser, selecting the region, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took roughly 22 seconds to get to the tracking results. That’s not bad, but it’s clearly slower than our FedEx experience. We then tried the same process again using the Google Chrome browser and it took an impressive 12 seconds to complete!
So, with all things considered, with the goal being to track a package as quickly as possible, here are the Usability scores:
(FedEx 10/10) (UPS 9/10)
It’s a close match, to be honest, but we’d have to say that FedEx.com still outperformed UPS.com on the speed factor, swiftly delivering not just highly anticipated parcels to its customers, but reliable website performance as well.
So, for the third AlertBot Showdown, the site that gets to join the ranks of previous winners Apple and Fandango is…
Most companies take advantage of third-party website monitoring services to watch their websites 24/7 for performance issues and downtime. These services alert them immediately when problems arise, equipping them with the knowledge they need to pinpoint the problem so their team can resolve it.
Companies rely on their website for many things. Whether their website is used to generate leads, drive business, or keep customers engaged, essential processes and pages on their website are often the lifeblood of their business and online presence.
In the same way that a routine doctor or dentist appointment evaluates your health and checks for any potential impairments or issues that need improvement or fixing, using website monitoring to routinely check your site’s performance is crucial to the success of your company’s online presence.
Here are some important processes and webpages to evaluate and monitor on your website:
Your landing page is the page that is supposed to hook your visitor, draw them in and get them interested in your product or service. Making sure these pages are always reachable by potential new customers is of utmost importance. It may seem like a no-brainer to monitor this vital page, but a lot of people who own small businesses do not think to apply website monitoring to their landing pages.
Once the user gets past your landing page, they become keenly aware of your website’s speed, particularly if it’s sluggish. With competition this fierce, one of the major website processes to monitor is each page’s loading speed. You cannot afford a home page that takes 10 seconds or more to load. The new generation of internet users is not patient enough to sit through a sluggish download or stare at a spinning “loading” icon. If you have a page that takes a long time to load, you may need to make some design alterations, incorporating minimalistic design that is both attractive and loads faster. Many web designers have taken this into account and adopted new techniques to make webpages load faster while retaining a fresh and respectable look. Website monitoring can help you identify whether your page load time is negatively affecting your bottom line.
Monitoring your website’s traffic and performance from different countries is extremely important. Knowing where most of your customers come from, and optimizing performance for that geographic area, can make all the difference for your business. If you cater to a certain state or province, then monitoring the specific geographical location or district that fuels your business is recommended.
E-commerce driven websites must monitor their shopping carts very closely. For example, if a customer placed products in a cart but did not buy them, it could mean there are issues with the checkout process. If you were not monitoring your cart, however, you would never know and might just assume they lost interest. Poor shopping cart performance directly affects your company’s sales, which makes monitoring your shopping cart processes that much more important.
Any page on your website that prompts a customer to sign up or register for a service needs to be up and running 24/7. Statistics show that when the signup pages of a website are not working optimally, visitors often abandon the signup process due to a loss of confidence. Since these pages are directly involved with registering new customers or providing new services to existing customers, they are some of the most crucial to monitor on your website.
Customer frustrations over not being able to access members-only areas of your website can cost you not only customers, but also support hours dealing with the problem. Getting ahead of the problem by monitoring these areas can save your company a lot of time and money.
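To make the list above concrete, here is a bare-bones sketch of what periodically checking a few crucial pages might look like. Every URL and threshold below is a hypothetical placeholder, and the alert function is a stand-in; a hosted service like AlertBot layers real browsers, multi-location testing and notification escalation on top of this basic idea.

```python
# A bare-bones sketch of checking a few crucial pages on a schedule.
# All URLs and thresholds are hypothetical placeholders.
# Requires: pip install requests
import time
import requests

CRUCIAL_PAGES = {
    "landing page":  "https://www.example.com/",
    "signup":        "https://www.example.com/signup",
    "shopping cart": "https://www.example.com/cart",
    "member portal": "https://www.example.com/members",
}
SLOW_THRESHOLD = 2.0  # seconds; the "users expect ~2 seconds" rule of thumb

def alert(message):
    print("ALERT:", message)  # stand-in for email/SMS/phone escalation

def check(name, url):
    try:
        resp = requests.get(url, timeout=10)
        # resp.elapsed measures time to the response headers, not a full
        # in-browser render; a rough proxy for responsiveness.
        seconds = resp.elapsed.total_seconds()
        if resp.status_code >= 400:
            alert("%s returned HTTP %d" % (name, resp.status_code))
        elif seconds > SLOW_THRESHOLD:
            alert("%s is slow (%.1f s)" % (name, seconds))
    except requests.RequestException as exc:
        alert("%s is unreachable: %s" % (name, exc))

while True:
    for name, url in CRUCIAL_PAGES.items():
        check(name, url)
    time.sleep(60)  # re-check every minute
```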
These are just some of the top areas of your website to ensure are running smoothly 24/7. Start monitoring your most crucial pages today with a no-risk, 14-day FREE trial of AlertBot and start saving your company time, money and unnecessary headaches.
Key customer-focused features of AlertBot’s new website include responsive design, improved UX, intuitive navigation, new content and more.
March 27, 2017 – AlertBot, a leading provider of enterprise-class server and website monitoring solutions, announced today that it has launched a completely redesigned website at www.alertbot.com.
“As a leader in website performance monitoring, we know how important it is to stay relevant and up-to-date with the latest technology and trends,” commented Pedro Pequeno, President of InfoGenius.com, Inc., which owns and operates AlertBot. “Our new website is the result of months of planning, development and testing. We are proud that it continues our tradition of quality and customer-focused updates that help make AlertBot so essential to our growing roster of customers worldwide.”
Key customer-focused features of AlertBot’s new and improved website include responsive design, improved UX, intuitive navigation and new content, among others.
Added Mr. Pequeno: “Since launching our new website, the feedback we have received from current and new customers has been incredibly positive. We look forward to enhancing and adding new features in the months ahead!”
About AlertBot
Founded in 2006, AlertBot enables businesses, through its industry-leading TrueBrowser® solution, to continuously monitor the availability and performance of their mission-critical public Internet services from across the country and around the world. When AlertBot detects an issue with websites or servers, it analyzes the problem within seconds from multiple geographic locations and delivers real-time alerts to business leaders and system administrators via smartphones and other mobile devices. Thousands of companies trust AlertBot to help them deliver the uptime and performance they expect, and their customers demand. Learn more at www.AlertBot.com.
About InfoGenius.com, Inc.
Founded in 1999 by a group of engineers, InfoGenius prides itself on building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services, including AlertBot, ELayer and UptimeSafe. Learn more at www.infogenius.com.
To become competitive in the global market, it’s crucial for your business to have a strong online presence. One of the best ways to ensure this is to have a user-friendly business website that is accessible ’round the clock. And if your customers rely heavily on your website, you know that any amount of time your site is down could be rather costly.
Frankly, website downtime is inevitable. Even the big online giants like Microsoft, Google, Facebook, eBay, YouTube, Amazon and CNN have experienced website downtime at some point. However, the good news is that you can mitigate the risk and lower the length of time your site remains inactive if you are familiar with some of the likely causes of website downtime.
Let’s dig a little deeper to find out the common causes of site downtime:
Server overloads occur when a big wave of online traffic overwhelms a server. There are two common situations in which this happens. First, it happens when your site is hosted on a shared server. Resources on shared servers are limited and must be stretched to support high volumes of traffic and site-processing needs, which can cause server overload. As a result, your site may be inaccessible to users for hours.
Second, server overloads may also happen on major online shopping days, like Black Friday and Cyber Monday, or any other occasion for that matter, when you have significant discount deals and special sales running on your website. Such deals draw in heavy traffic, thus increasing the chances of server overload and site downtime.
Server and network failures can bring a website to a screeching halt in no time flat. This could be caused by things like hard drive failures, power supply failures, circuit board failures, or cabling failures. It can also be caused by more troubling failures like data center infrastructure failures or network peering failures.
Your business may experience downtime because of errors caused by the site’s webmaster. For example, your site may not be accessible to your audience if your webmaster forgets to renew the site’s hosting contract or domain name.
Common coding errors include incorrect syntax, infinite loops and typos. All of these can exhaust the server’s resources and yield 500 (Internal Server Error) responses, resulting in website downtime.
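As a tiny illustration of how an ordinary coding error becomes a 500 for your visitors, consider the deliberately buggy route below in a minimal Flask app (Flask is used here purely as an example framework):

```python
# A deliberately buggy route showing how a simple coding error surfaces
# to visitors (and to an external monitor) as an HTTP 500 response.
# Requires: pip install flask
from flask import Flask

app = Flask(__name__)

@app.route("/ok")
def ok():
    return "All good!"

@app.route("/buggy")
def buggy():
    # Typo-style bug: 'greeting' was never defined, so this raises a
    # NameError, which Flask turns into "500 Internal Server Error".
    return greeting  # noqa: F821 (the bug is the point)

if __name__ == "__main__":
    app.run(port=8000)
```

An external monitor that requests /buggy sees the 500 immediately, even while /ok and the rest of the site look perfectly healthy, which is exactly why monitoring individual pages and processes matters.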
With the surge in cyber crime, you need to make sure that your website is well-protected from cybercriminals, hackers and viral infections. Cybercriminals know how to hijack websites and redirect your site visitors to other websites or expose them to malicious content.
All of this can result in lengthy website downtime, which can be detrimental to your business sales, profits and reputation. And that is definitely something that no business owner wants! One way to help prevent cyber attacks is to keep your IT team, and those directly responsible for the health of your website and server, in the know about the latest cyber threats.
Distributed Denial of Service (DDoS) attacks can also bring your online business to a standstill. DDoS attacks are planned: heavy traffic is deliberately directed from many different sources to overload servers and, in some cases, crash them entirely.
Website downtime may also occur when your data center is hit by a natural disaster like floods, hurricanes, earthquakes, fires, etc.
Lastly, if you have a dedicated server, you may need to go offline for server maintenance. This usually involves upgrading hardware components, drivers, operating systems, firmware, and even software applications. With these planned occurrences, you can alert customers ahead of time to the planned outage, which can help combat and minimize the effect it may have on your business.
Knowing the reasons for, and causes of, website downtime is crucial as it will help you devise and implement the right mix of strategies to overcome and avoid it.
AlertBot’s external website monitoring service exists to help businesses like yours identify and fix website errors when they happen and hopefully prevent future downtime. Visit www.AlertBot.com for more information and to sign up for a free, no-risk trial.
For our second Showdown, we decided to grab an oversized bucket of popcorn, an unreasonably large cup of soda and a pair of cheap, plastic 3D glasses, and plopped down into the comfiest of chairs to evaluate two of the premier movie ticket buying sites: Fandango and MovieTickets.com.
If you’re a movie buff who loves a night out reclining in front of ceiling-high silver screens to watch the latest Hollywood has to offer, chances are you’ve purchased tickets online before. And what would be more frustrating than website failure while you’re trying to combat the masses to secure your entry into an anticipated film’s opening night?
We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites from December 26, 2016 to January 16, 2017. Not surprisingly, the performance proved to be pretty good overall, although one of the sites experienced some pretty significant issues on one of the days. Both sites saw some minor “Slow Page” warnings, but MovieTickets.com took a hit right after Christmas with a dreaded “Server Too Busy” error, meaning its web server couldn’t withstand the weight of the traffic it was getting.
For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for the causes of those failures.
Fandango’s website experienced not one single failure event. The worst it got for Fandango in this time period was a handful of minor “Slow Page” warnings on Dec. 29th and, after the new year, on Jan. 2nd and 5th. (Fandango 10/10)
Meanwhile, MovieTickets.com experienced what AlertBot considered to be 13 failure events. While most were only 3- to 5-minute-long slow page loads, on Dec. 27th (a Tuesday) the site saw some significant outages, reporting a “Server Too Busy” error for nearly 6 hours! Chances are that visitors during this stretch were met with that dreaded error message in their web browser instead of the actual site, which would have been super frustrating (especially for anyone trying to order tickets in a jiffy). (MovieTickets.com 7/10)
When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
Both websites have pretty busy front pages that tend to change often and feature videos, Flash-driven ads and some graphic-heavy content – all of which can really compromise a website’s load time.
Fandango’s speed is solid, averaging less than 2 seconds. Its fastest day was Friday, Jan. 13, 2017 at 1.7 seconds and its slowest was Thursday, Dec. 29, 2016 at just over 2 seconds. (Fandango 9/10)
MovieTickets.com didn’t fare quite as well, unfortunately. On its best day, Tuesday Jan. 3, the front page took almost 4 seconds to load. On its worst day, Dec. 27 (also a Tuesday), it took almost 7 seconds to load – which is definitely below the current online industry’s website load time standards. (MovieTickets.com 7/10)
It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. Fandango saw the fastest load times in California (we suppose that makes sense, given that the movie industry is centered there), with the slowest happening in Texas. Even the slower states – Washington, Virginia and Texas – saw load times around 2 seconds, occasionally pushing 3. (Fandango 10/10)
For MovieTickets.com, it’s a different story. We already know they struggled with speed, but the question here is: where? California was also the best location for MovieTickets.com, with load times around 2.7 seconds. The worst, again, was Texas, at almost 6.5 seconds. Florida and North Carolina also performed well, while Washington joined Texas as one of the slower locations. (MovieTickets.com 8/10)
For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we chose the task of ordering the latest cellphone from the respective sites of Apple and Samsung. For these sites, we’ll see how the experience of ordering movie tickets compares between the two.
Starting with selecting a movie to buy tickets for, we approached each site with the goal of ordering two tickets for the recently released The LEGO Batman Movie.
Since we picked a brand new film, it was easy to find it on the homepage of Fandango.com and start clicking through to order tickets. From the point of typing http://www.Fandango.com into our Firefox browser, clicking on The LEGO Batman Movie poster, putting in our zip code, and selecting the next available time and number of tickets, it took roughly 40 seconds to get to the Fandango checkout. That’s not bad. If this were for real, we probably would have spent extra time reviewing our show time options, choosing 2D over 3D, etc. But for this task, we figured it best to keep it simple. The whole process took about 4 clicks of the mouse, with a little typing to put our zip code in.
For MovieTickets.com, we found the experience to be mostly the same, except that when we put our zip code in, MovieTickets.com seemed to give more options right off the bat. Fandango suggests the closest theater for your zip code and the first batch of showings it finds (in this case, a 3D showing), while MovieTickets gives you the full list of showings and format options. Our experience felt more thorough with MovieTickets, getting more choices right away, though the extra options also delayed our browsing because we had to read and think more. Still, the browsing time for MovieTickets.com – to complete the same process – was the same 4 clicks and around the same 40 seconds.
We tried both again from Google Chrome, and even setting aside our newfound familiarity with both sites, we found each process took just 30 seconds this time. They’re easy sites to navigate, and their load times were swift.
So, with all things considered, with the goal being to order tickets for one of the most recently released films, here are the Usability scores:
(Fandango 9/10) (MovieTickets.com 10/10)
We’d say Fandango won by quite a bit, given its better web performance over MovieTickets.com, though we enjoyed the usability of MovieTickets.com over Fandango’s. The fact that Fandango doesn’t present show time options upfront is a little unfortunate.
Still, one cannot ignore good web performance, and we have to hand it to Fandango for achieving impressive site speed and reliability. So, with that said, the result of the second AlertBot Showdown is…
So, your business website is offline again and your IT team has sprung into action, trying to pinpoint the issue and fix it as soon as possible. Sure, it’s good that your IT experts are handling the problem responsibly, but do you know how much money your business may have lost during your website’s downtime? Well, if you are a major player in the ecommerce industry, chances are you could have lost millions of dollars by now. And that is not an overstatement.
Like it or not, even an hour of downtime can do a great deal of damage to your online business. Did you know that in 2014 Google experienced an hour of downtime, reportedly caused by a virus, that affected Gmail, Google+ and Google Drive and knocked Google’s stock down 2.4 percent?
But that’s not all! Amazon, the e-shopping giant, once experienced 2 hours of downtime, presenting site visitors with cryptic HTTP error messages. In just 2 hours, Amazon lost an estimated $3.48 million. That’s huge!
So, if you wish to estimate the true cost an hour of website downtime has to your business, you’ve come to the right place. Here are some of the more important variables to consider when calculating this cost:
To figure out exactly how much an episode of website downtime costs in terms of lost sales, you need to determine your average profit per minute during the time period the downtime occurred, then multiply that figure by the number of downtime minutes to arrive at your total lost sales profits. If the downtime occurs at 2 in the afternoon, for example, it will most likely cost your business more sales than if the outage had happened at, say, 2 in the morning, when web traffic is typically much lighter.
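As a worked example of that arithmetic, with entirely made-up figures:

```python
# A worked version of the lost-sales arithmetic described above.
# All figures are made-up examples, not real revenue data.
daily_profit = 14_400.00                      # hypothetical profit per day ($)
profit_per_minute = daily_profit / (24 * 60)  # flat average: $10.00/minute

outage_minutes = 45
# Weight by time of day: a 2 PM outage might see 3x the average traffic,
# a 2 AM outage perhaps 0.2x. These multipliers are illustrative guesses.
traffic_multiplier = 3.0

lost_sales = profit_per_minute * traffic_multiplier * outage_minutes
print(f"Estimated lost sales: ${lost_sales:,.2f}")  # $1,350.00
```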
Downtime (especially if it’s frequent or at a crucial time) can scar your business’s reputation, costing you the trust and loyalty of customers in your brand. Just like many businesses, you have invested good money and a great deal of time in brand building. That time and money can go to waste if you experience downtime, even if it is for just an hour. When considering the true cost of your site’s downtime, it is important to keep in mind the resources you’ll need to spend to repair your tainted brand image going forward.
Another factor to consider is the money you have invested in marketing efforts, like PPC (pay-per-click) campaigns. You need to figure out how much was spent on marketing while your site was experiencing downtime. This is important to calculate because, let’s face it, you literally reaped no benefit from that money: your site was inaccessible when prospects clicked on the PPC link or advertisement.
Calculating the cost you might have incurred from an hour of website downtime is essential, but there are also precautions you can take to avoid unplanned downtime and keep your business up and running ’round the clock (and be a hero!). AlertBot is an intuitive web-based website monitoring service that can alert your team to website errors and slowness within seconds and help you keep track of your site’s performance, both of which go a long way toward mitigating downtime. Start the AlertBot 14-day free trial today!
Given that we live in a highly digitized world, websites, blogs and web stores are now an essential component of any business and brand. Waiting for a site’s content to load is annoying for a user, and it can also be potentially disastrous for business.
That, however, is only one reason to monitor the performance of your website. Here are four more:
First and foremost, businesses maintain websites and web stores to promote commercial growth. Now, imagine you’ve gone to a store where the service is impossibly slow. The salesmen and women are hardly making an effort to engage or help you, so you decide to take your business elsewhere. The same happens when a shopper visits a website that takes ages to load. Instead of making a sale, you lose web traffic and potential customers. You can prevent this by monitoring how your website is performing.
Customers talk, and they are interested in what others like them have to say. While most brands depend on marketing ploys to promote sales, the importance of word-of-mouth advertising cannot be discounted. If you leave a bad impression on one customer, chances are that word will spread, tainting, if not tarnishing, your hard-earned reputation and brand image. Who wants that?
Website performance monitoring is the best way to prevent errors. It’s all too common for ecommerce sites to hit a snag and run into trouble. If your site is regularly maintained and monitored, you’ll not only be able to fix a problem sooner; you might even be able to detect it beforehand and prevent it completely.
Just as quality assurance is essential for a physical store, it’s equally important for a website or web store. By using a performance testing and maintenance tool, software or application, you can standardize and retain the quality of your website. Not only will that help preserve the website’s ranking on Google, it will also help drive online traffic. As it is, Google ranking is affected by the minutest changes in website speed and downtime, which is the whole reason websites are search engine optimized in the first place.
So, if you’re even partially convinced that your website needs performance monitoring, why not start the AlertBot 14-day free trial, today?
If you’ve ever tried to buy tickets online for an event – whether a popular Broadway play, a Las Vegas show, a concert in a local city or even a popular science fiction blockbuster movie months in advance – chances are you’ve struggled to obtain tickets at regular price. Part of the problem is that many online scalpers have perfected the art of snatching up tickets with bots as soon as they’re available to the public. Fans can breathe a sigh of relief, however, because just this week President Obama signed The Better Online Ticket Sales (BOTS) Act to help combat this.
It’s unfortunate that this online ticket scalping problem has become so rampant that it’s taken governmental action to try to put an end to it. That blows our minds. But, in our opinion, the BOTS Act should be good for the consumer (and it’s about time!). It’ll allow for more tickets to be available at face value for consumers online than ever before. Event ticket vendor Ticketmaster commented yesterday, “On behalf of artists, venues, teams, and especially fans, we applaud the BOTS Act being signed into federal law.”
The BOTS Act won’t be good for everyone, though. For consumers with fatter wallets who’ve grown accustomed to purchasing tickets last minute, and at inflated prices, the availability of those second-chance tickets may now be slim to none.
The BOTS Act doesn’t really impact AlertBot, but if someone had been considering creating an automated script on our system to buy highly sought-after tickets as soon as they go on sale, that won’t be possible anymore. That’s not a use case we’d ever thought of before, but we suppose it was a possible scenario. As of this week, however, the AlertBot team will not allow any scripts of this nature on the AlertBot system (and as far as we know, there never have been any). However, if you work for a ticket sales website and have been thinking of monitoring your own ticket purchasing pages and shopping cart, you can still do that with AlertBot. That’s exactly what AlertBot is meant for, and the BOTS Act provides an exception allowing online ticketing operators to use bots to monitor and test their own systems for flaws. That’s a good thing! As we know, some of these ticket vending sites could use some improvement.
The BOTS Act was first presented to the Senate in mid-July of this year and, on December 7th, the House passed it without objection. It was then presented to the President on Monday of this week, who signed it. The Act, which applies to online sales for any public event exceeding 200 people, is the first of its kind pertaining to online bots and can be viewed in its entirety here.
Allentown, PA / December 13, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading TrueBrowser®-based web application monitoring solution, AlertBot, is pleased to announce the launch of a new series of AlertBot blogs the team has dubbed ‘Website Showdowns.’ AlertBot’s Showdown blogs will feature monitoring results from competing websites, showcasing AlertBot’s TrueBrowser® technology at work, which combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions.
The AlertBot Showdown blogs will evaluate each website’s performance based on four categories, including reliability, speed, geographical performance and usability, complete with time-based trends and detailed analytics.
This month’s scrimmage pits rivals Apple.com against Samsung.com. With two titans of industry like these going head to head, the results were, for the most part, not unexpected. Read the full report here.
AlertBot remains on the cutting edge of website performance. With 85 global test locations operating over 7 Internet backbones developed during the past decade, AlertBot has established its reputation in real-world private industry applications. AlertBot serves over 10,000 users spanning 6 continents worldwide with 200 million website checks per month. Its Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine.
About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.
About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself on building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services, including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.
If website performance is important to you, then you know just how vital it is to your business’s success. To AlertBot, web performance is everything. This topic is of great interest to us; we live and breathe web performance on a daily basis. And it got us thinking: we all love a good head-to-head, mano-a-mano rivalry. Tyson vs Holyfield. The Hatfields vs The McCoys. The Jets vs The Sharks. Prego vs Ragu. Luke vs Vader. So we thought: what if we tracked the performance of two websites within a certain genre and pitted them against each other? Who has the better website performance? Who will come out on top?
Every fall, Apple releases a new iPhone like clockwork. But Apple isn’t the only game in town: with Apple celebrating the recent release of the iPhone 7, Samsung has its Galaxy S7 (which was released in March). So we decided it was fitting to have Apple.com go toe-to-toe with Samsung.com. The results were not unexpected. (Well… most of the results.)
When you have companies as serious about their products and innovation as these two, you’d expect their websites to perform impeccably. And, honestly, they did.
We tracked the sites and examined three weeks in September – the 1st through the 22nd – to see how these sites performed. During this timeframe, we tested the websites around the clock from 17 different locations across the United States using AlertBot’s TrueBrowser Monitoring. The tests were performed by loading their homepages inside real Firefox browsers and giving them a maximum of 7 seconds to render and become fully interactive. Anything beyond 7 seconds (which is well above the average expected page load time) was considered a failure. After compiling all the data, this is what we found:
When we examine the reliability of a website, we’re looking for failure events – like when pages don’t fully load or go down completely – and try to identify the cause of the failure. Some common causes are slow third-party code used on pages, incomplete page content, actual web server failures, etc.
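(For technically minded readers: a hard load-time budget like the 7-second cutoff described above is easy to approximate with an off-the-shelf browser driver. The sketch below uses Selenium with Firefox purely to illustrate the idea; it is not the TrueBrowser system itself, and the URL is a placeholder.)

```python
# A sketch of a 7-second pass/fail load budget in a real Firefox browser.
# Illustrative only; requires selenium + geckodriver.
from selenium import webdriver
from selenium.common.exceptions import TimeoutException

driver = webdriver.Firefox()
driver.set_page_load_timeout(7)  # anything slower counts as a failure

try:
    driver.get("https://www.example.com")  # placeholder homepage
    print("PASS: page finished loading within the 7-second budget")
except TimeoutException:
    print("FAIL: page did not finish loading in 7 seconds")
finally:
    driver.quit()
```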
For Samsung, the website experienced no failure events during our test period and achieved 100% uptime. This is definitely above the norm for website performance, but not unexpected for a company like Samsung. We would have loved to find some juicy failure-generated data to talk about, but Samsung’s website was as clean as a whistle on this front. (Samsung Score 10/10)
Similarly, Apple.com experienced no failure events and achieved 100% uptime. While we’d expect nothing less from a juggernaut like Apple, it’s still impressive when you consider how many other retailers experience frequent website issues. (Apple Score 10/10)
When we evaluate a website’s speed, we’re looking at the time it takes the site’s homepage to render and load to the point of being fully interactive. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
While evaluating the speed of the websites specifically, Samsung.com’s fastest day was Friday, Sept. 2nd, with its slowest day being Saturday, Sept. 3rd. On average, the site’s homepage took around 1.7 seconds to load. That’s not bad at all! Some recent studies have shown the expected load time for ecommerce sites to be 2 seconds or less, so Samsung definitely fits the bill here. Some online studies have also determined that if an ecommerce site is making $100,000 per day in sales, just a 1-second page delay could potentially cost the company $2.5 million in lost sales per year; that works out to roughly a 7 percent hit on $36.5 million in annual sales. (Ouch!) On its slowest day (Sept. 3rd), Samsung.com saw load times in the range of over 7 seconds at times during the day. (Samsung Score 9/10)
While evaluating Apple.com’s speed, its fastest day was also a Friday, Sept. 9th, and its slowest day was Friday, Sept. 2nd (coincidentally, the same day Samsung saw its fastest load times), when the site took 10 seconds to load at times (due to a slow page file error). However, on average, the site’s homepage took around 1.3 seconds to load. That’s a hair faster than Samsung’s, though the two are close to each other. (Apple Score 9/10)
One major mistake a lot of websites make is using large graphic files or third-party code on their home page; things like that can really bog down a website’s speed. It’s not surprising that both Apple and Samsung avoid this mistake. While both display large, beautiful images on their front pages, they optimize their file sizes well.
When we looked at Samsung.com’s performance at various locations around the United States, we found that the site consistently took longer to load in Texas, with its slowest time occurring in Washington, DC, and was fastest in Florida, North Carolina and Georgia. Samsung.com had just a handful of minor site hiccups during this three-week period, but only at specific locations. For example, AlertBot registered 5 instances of slower load times: once in New York, twice in Florida, once in Washington, DC and once in Washington state. Still, it managed to perform more than adequately at these locations overall. It wouldn’t be uncommon for websites to experience significant trouble in certain areas of the country on a regular basis, but we expect only the best from Samsung. (Samsung Score 9/10)
When we looked at Apple.com’s website performance from various locations around the U.S., we found that the site consistently took the longest to load in Utah and Texas, but was fastest in Florida and North Carolina. It’s intriguing that both Florida and North Carolina saw the best load times for both websites, while Texas was among the slowest for both. AlertBot did catch two instances of slower load times and a slow JavaScript file in Illinois, but neither problem caused the site to go completely down. (Apple Score 9/10)
For usability, we select a common task that a typical user might want to perform on sites like these. Then, using hands-on testing, we perform the same task on each website while timing how long it takes to complete and how many mouse clicks it takes to get the job done. This time, we decided to approach each site with the intention of purchasing their latest phone. We timed how long it would take from the point of entering the URL into the browser on through to getting the phone into the online shopping cart.
From the point of typing in “Apple.com” and clicking through their site from the phone product pages all the way to the shopping cart, it took 45 seconds (and 7 clicks of the mouse) for us to add a SIM-free 256GB “jet black” iPhone 7 to the online “shopping bag.” (There’s an additional click, however, to view the cart when you’re done adding the phone to it.)
From typing “Samsung.com” into our browser and clicking through to add a Samsung Galaxy S7 Edge 32GB “unlocked” phone to our shopping cart and view the virtual bag, it took a shocking 1 minute and 30 seconds (in 5 mouse clicks)! We used Google Chrome as our browser for both websites, and the Samsung site froze up twice during the process (in fact, we accidentally added TWO of the same phone to our cart because we were trying to click through to the cart and it was unresponsive). Just to be fair, we tried again, and it hung up yet again during the ordering process, though this time it took a little under a minute to get to the shopping bag. All of this happened on Chrome’s latest version, too. We know web browsers can be super fickle, though, so we decided to try a third time, this time with Mozilla Firefox, and it took 20 seconds to get the same phone into the shopping cart. On Apple’s site, there are a lot more choices for the iPhone – from storage space to phone color – so it makes sense that the process might take longer there. But it is rather alarming that Samsung’s site experienced THAT much trouble while just trying to add its own phone to the shopping cart.
Just to compare via Firefox, then, we re-performed the timed test for Apple.com. One could argue that re-tests don’t account for newfound familiarity with either site, but it took 25 seconds to add the same iPhone 7 to the shopping cart. While that’s a few seconds slower than Samsung’s Firefox run, we also didn’t experience any problems on either browser with Apple’s site.
All things considered, here are the Usability scores:
(Samsung Score 7/10) (Apple Score 9/10)
The performance of both sites was very, very good and quite close. Apple’s site just barely edged out Samsung’s on speed and geographic performance, while the two matched each other on reliability. Despite their slight differences, both performed at the top of their game in online performance. However, after factoring in our usability testing, where Apple’s site performed much more consistently, the winner of the very first AlertBot Showdown is clear:
Allentown, PA / September 21, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading TrueBrowser®-based web application monitoring solution, AlertBot, is pleased to announce that they are now SAM certified and are looking to grow their relationships with Federal and State Governments. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions. Downtime of any length can be costly for any website or online business; AlertBot’s Website Monitoring Service uses TrueBrowser® technology to launch real web browsers and test websites inside those browsers, covering mission-critical pages such as login pages and financial transactions conducted on government websites. Learn More about Trusted Government Website Monitoring.
“With 85 Global Test Locations operating over 7 Internet Backbones developed during the past decade, AlertBot has established their reputation in real-world private industry applications; this level of website testing and monitoring is both proven and ready for Public Service Deployment as Federal and State Agencies rollout ‘Next Gen’ consumer style, interactive websites,” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “We’re looking forward to showcasing AlertBot’s TrueBrowser® technology and capabilities to Governmental Agencies throughout the country and help them validate their client usage.”
AlertBot serves over 10,000 users spanning 6 continents worldwide with 200 million website checks per month. Their Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives government organizations, including the U.S. Department of Energy, Virginia state government, NOAA, U.S. Marine Corps, and Smithsonian Institution, the information they need to ensure their applications are always running error-free and providing a quality user experience. AlertBot has registered to do business with Federal and State Agencies using the following registrations: DUNS: 624818493; CAGE: 6QP16; NAICS: 518210, 454111, & 334290.
About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.
About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.
###
]]>We enjoyed visiting Chicago, too – for some of us it was our first time there – and took advantage of riding the “El” around town and visiting the 360 Chicago Observation Deck as well as some of the many fine eateries found there.
We had our booth set up in the exhibit hall among hundreds of other exhibitors, including recognizable names like eBay, FedEx, Shopify, UPS, USPS and many more. Many of the booths had great swag to give away. Our favorites included a gift box from Aramex, which included power banks shaped like freight trucks; Blue Acorn with their squirrel-shaped stress toys; Artifi Labs, who were giving away ice cream scoops and free ice cream sandwiches; and Classy Llama with their soft plush llamas and superhero mask and cape sets. Some of the bigger brands had some cool handouts too, like eBay with their USB reading lights and mini journals, PayPal with free sunglasses, or UPS’s cell phone rests and sanitizer spray markers.
Conference attendees who visited the AlertBot booth had the opportunity to meet some of our great staff and talk with us about what AlertBot could do to monitor their ecommerce platform. Each of our booth visitors got to spin our prize wheel for an opportunity to win brand new AlertBot water bottles, AlertBot playing cards, or grand prizes like Back To The Future Flux Capacitor USB car chargers and remote control helicopters. Last, but certainly not least, attendees could use their phones to scan a QR code (or visit win.alertbot.com) to enter an even bigger giveaway to win a DJI Phantom 3 Standard quadcopter drone!
Events like these are great because they allow us all to venture out from behind our computer screens to meet our customers and prospective customers face-to-face and connect on a more personal level.
If you weren’t able to attend IRCE this year, that doesn’t mean we can’t still talk! Shoot us an email. We’d love to chat with you and tell you why we think AlertBot is right for you – and why we know you’re going to love it!
]]>Allentown, PA / May 24, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading real-time web application monitoring solution, AlertBot, is pleased to announce that they will exhibit at the Internet Retailer Conference & Exhibition (IRCE) 2016 in Booth #841. The conference will take place June 7-10, 2016 at McCormick Place West in Chicago, IL. At IRCE, AlertBot will be demonstrating its TrueBrowser® Web Application Monitoring solution. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions. Downtime of any length can be costly for any website or online retailer; AlertBot’s Website Monitoring Service uses TrueBrowser® technology to launch real web browsers and test websites inside those browsers, covering mission-critical pages such as login pages and financial transactions conducted on e-commerce-driven websites.
“We’re looking forward to showcasing AlertBot’s TrueBrowser® technology and capabilities at the Retail Industry’s Leading E-Commerce Conference and Tradeshow (IRCE),” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “Over the past 10 years, AlertBot has been deployed and proven in countless real-world applications by some of the leading names in the e-commerce space, and this gives us another opportunity to demonstrate our advanced technology.”
AlertBot serves over 10,000 users with 200 million website checks per month using its network of over 100 locations, spanning 6 continents worldwide. Their Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives businesses including Blue Cross/Blue Shield, Chrysler, Mutual of Omaha, Sony, Microsoft & Dell Computing the information they need to ensure their applications are always running error-free and providing a quality user experience.
About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.
About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.
###
]]>Allentown, PA / April 11, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading real-time web application monitoring solution, AlertBot, celebrates a decade of website and server monitoring. Downtime of any length can be costly for any website or online retailer; AlertBot’s Website Monitoring Service provides best-in-class site monitoring using its TrueBrowser® technology to launch real web browsers and test websites inside those browsers, covering mission-critical pages such as login pages and financial transactions conducted on e-commerce-driven websites. AlertBot serves over 10,000 users with 200 million website checks per month using its network of over 100 locations, spanning 6 continents worldwide.
“AlertBot measures every facet of a website to help our clients improve the user experience; our testing helps clients make adjustments that result in measurable gains – for instance, a major e-commerce player measured gains of $1.4 million for every second of response time their platform improved – that small improvement netted them $18 million in revenue!” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “Over the past 10 years, AlertBot has been deployed and proven in countless real-world applications by some of the leading names in the e-commerce space.”
AlertBot’s Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives businesses including Blue Cross/Blue Shield, Chrysler, Mutual of Omaha, Sony, Microsoft & Dell Computing the information they need to ensure their applications are always running error-free and providing a quality user experience.
About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.
About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.
]]>If you’re not quite sure what that is, third-party code is usually any code provided by another company or website to plug in or embed a service on your website. For example, you may have a web stats tracking code, a banner ad rotator, or a couple of lines of code that drop your Twitter or Instagram feed onto your website. These pieces of code are considered third-party code since they’re provided by another source.
Some of the problems that this kind of code can cause include inaccurate visitor stats, slow page load times, and loading errors.
The case of inaccurate stats is a particularly interesting one that most people don’t consider. Problems with third-party code can render your website’s stats unreliable if the stats code is not fully loading. When this happens, you may be getting only partial information about your visitors, or no information at all. If you make business decisions based on those stats, you may be making the wrong decisions based on misinformation.
In the case of third-party code causing slow page load times or loading errors, it affects your visitors’ experiences on your website. Unhappy visitors may choose not to buy from you, and oftentimes they won’t ever return to your website.
So what can you do in this situation? First off, you’ll want to diagnose the problem to make sure it is indeed the third-party code causing the trouble. AlertBot is an excellent service for finding out what is causing a bottleneck in your load time.
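If you want a quick first read on your own, here’s a minimal sketch of the idea: fetch a page, find every externally hosted script, and time each one individually. This is our own illustration, not AlertBot’s implementation; the URL is a placeholder, and it assumes the requests and beautifulsoup4 packages are installed. A script that consistently takes seconds to respond (or fails outright) is a prime bottleneck suspect.

```python
import time
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/"  # placeholder: the page you want to audit

page = requests.get(PAGE_URL, timeout=10)
soup = BeautifulSoup(page.text, "html.parser")
first_party_host = urlparse(PAGE_URL).netloc

# Time every script hosted somewhere other than our own domain.
for tag in soup.find_all("script", src=True):
    src = urljoin(PAGE_URL, tag["src"])
    if urlparse(src).netloc == first_party_host:
        continue  # first-party script; not what we're hunting for
    start = time.perf_counter()
    try:
        resp = requests.get(src, timeout=10)
        print(f"{src}: HTTP {resp.status_code} in {time.perf_counter() - start:.2f}s")
    except requests.RequestException as exc:
        print(f"{src}: FAILED ({exc})")  # a hung or dead third-party host lands here
```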
Once you know for sure that it is the third-party code creating the issue, there are a few common ways to resolve it: load the offending script asynchronously (or defer it) so it doesn’t block the rest of the page, host the asset yourself where the provider allows it, contact the provider about the slowdown, or simply remove embeds that aren’t pulling their weight.
So, as you can see, third-party code can greatly impact your website. If you’re experiencing web performance issues and you’re utilizing third-party code, there’s a pretty good chance that code is the catalyst.
Sign up for a risk-free trial of AlertBot today and start down the path to better performance for your website. AlertBot can track the performance of all your third-party code and let you know when it’s causing problems.
]]>Running an AlertBot waterfall chart is just one example of something IT and website managers can do to see how the site is performing with load times. Before you release your site’s new design, complete overhaul or its grand debut, it’s wise to give your site a thorough testing first. A simple test with AlertBot’s waterfall chart can reveal images or third party code that might be clogging up your load time — or many other possible hang-ups. Your site might look fine and dandy, but if the page is taking too long to load in this fickle web-browsing age we’re living in, it could be very costly for your business.
(Above is a real, abbreviated example of AlertBot’s waterfall charts)
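If you’re curious about the raw data behind a chart like this, every modern browser exposes it through the W3C Resource Timing API. Here’s a rough sketch of how to pull it yourself with Selenium and Chrome; this is our own illustration rather than AlertBot’s tooling, and the URL is a placeholder.

```python
from selenium import webdriver

driver = webdriver.Chrome()  # assumes a matching chromedriver is installed
driver.get("https://www.example.com/")  # placeholder page under test

# PerformanceEntry properties live on the prototype, so call toJSON()
# to get plain objects Selenium can hand back to Python.
entries = driver.execute_script(
    "return performance.getEntriesByType('resource').map(e => e.toJSON());"
)
driver.quit()

# Print each component's start offset and load duration, slowest first --
# the same rows a waterfall chart draws as bars.
for e in sorted(entries, key=lambda e: e["duration"], reverse=True):
    print(f"start {e['startTime']:8.0f} ms  took {e['duration']:7.0f} ms  {e['name']}")
```

A one-off script only tells you about a single load at a single moment, of course; the value of a monitoring service is running checks like this continuously, from many locations.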
In the end, it’s really like a visit to the dentist; they often give you tips and guides on how to prevent cavities and other oral problems, while helping you maintain good oral hygiene. Maybe using mouthwash or flossing daily will help keep your gums healthy and your teeth strong. Likewise, with the right web performance tools and tests, you can ensure quality conversions and hopefully prevent any possible decay in your site’s performance.
See for yourself with AlertBot’s completely free 14-day trial!
]]>What: Internet Retailer Conference & Exhibition (IRCE) is the flagship event for the e-commerce industry. Learn the latest trends in the industry from experts who are implementing the latest technologies and solutions. IRCE 2016 will take place in the world-class city of Chicago, IL.
When: June 7-10, 2016
Where: McCormick Place West, Chicago, IL
At IRCE, AlertBot will be demonstrating its TrueBrowser® Web Application Monitoring solution. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions.
Planning to Attend?
If you’re planning on attending, make sure to stop by the AlertBot booth to meet our team, see a demo and get some cool AlertBot swag. If you would like to sit down for a one-on-one demo during the conference, please don’t hesitate to email us ahead of time. Oh, and mention this newsletter announcement when you meet us at our booth and we’ll give you an extra spin on our prize wheel for a chance to win a second prize!
]]>The report examined several sites that had experienced minor to substantial downtime from Friday through Sunday, including NeimanMarcus.com and Staples.com. That report, however, did not take into account Cyber Monday or the sales-heavy weeks that have followed.
On Cyber Monday, Neiman Marcus seemed to right the ship, and only a few sites, including Newegg.com and Staples, experienced failures of roughly 10 minutes or less, with Gap’s site registering 19 minutes of unresponsiveness.
Throughout this month of December, Staples has continued to struggle with short outages; Walmart experienced some brief HTTP errors; and Footlocker, PC Mall, Crate & Barrel and Peapod each experienced a couple of few-minute hiccups (which in some cases may simply have been sluggish load times).
With online sales outperforming brick-and-mortar sales this Black Friday for the first time, it’s more imperative than ever to make sure your site stays up and working, error-free. Sign up for a free 14-day trial with AlertBot.com today and start working towards a more successful ecommerce site!
]]>While websites like Walmart, Fanatics and QVC experienced a couple of several-minute outages on Thanksgiving Day, one of the sites that struggled the most on Black Friday this year was the online destination for department store Neiman Marcus. The site even experienced a two-hour outage in the morning.
Second only to NeimanMarcus.com was online tech retailer Newegg.com, which experienced some slow page load times, no doubt due to the heightened traffic. Finally, Staples.com also experienced some short outages, but nothing more than a few minutes each.
Through Saturday and Sunday, it was much the same with Neiman Marcus, Staples and Newegg, with Walmart seeing a few hiccups and Shutterfly.com experiencing a 45-minute outage due to its server being overloaded with traffic. Sony’s PlayStation Network also experienced significant downtime on Saturday, which affected their online store as well.
Downtime of any length can be costly for any online retailer. According to this article by Evolven.com, “The average cost of data center downtime across industries was $5,600 per minute.” At that rate, a two-hour outage like Neiman Marcus’s would run roughly $672,000. Clearly, downtime adds up very quickly, especially on a major shopping day.
With AlertBot’s monitoring services, not only can you be alerted the moment your site experiences an outage or slow load times, but you’ll also be able to use AlertBot’s charts and reports to find potential hang-ups and head off problems that could result in unnecessary downtime.
Give AlertBot a try with our totally free trial period and start seeing how AlertBot can look out for your business to help you prevent serious financial loss and online disasters.
]]>Get Your Website Ready For Holiday Traffic
It’s that time of year again. As we say farewell to summer and prepare for the coming of autumn next week, online retailers are faced with one harsh reality: Black Friday is a mere two months away. And while that may seem like a long way off to some, now is really the time for preparation. Just as any brick-and-mortar retailer needs to have their store ready to go with employees on hand to wrangle the shopping masses, website owners need to make sure their site is tuned up and ready for an influx of traffic.
If you’re feeling pretty confident that you’re ready and that this warning may seem premature or unnecessary altogether, let’s take a moment to spotlight last year’s Black Friday festivities and pitfalls.
The biggest name to experience major website failures last November was electronics retail chain Best Buy. Issues were recorded and reported throughout the day on Black Friday, and they sent social media abuzz with chatter and complaints about the site’s performance—or lack thereof.
Best Buy wasn’t the only one affected, however. Computer company HP’s webstore also experienced failure, while in the UK, online stores Currys (electronics), Argos (department store) and Tesco (groceries) all went down as well.
So what can we glean from this?
If you’re an online retailer, you’re probably already thinking about the holidays and getting prepared, but now is the most crucial time to not only make sure you have reliable website monitoring, but to evaluate your website’s performance so you can make improvements before the big online sale days. And you’re in luck – AlertBot can assist with your performance evaluation and help ensure your site performs better in time for the holidays. Try it out for free with our 14-day trial.
]]>Use AlertBot To Monitor The Competition
When most of us think of “website monitoring,” we usually think about how it applies to our own websites. However, website monitoring really has more uses than we may realize or consider.
Truth be told, while most of us use AlertBot to keep an eye on our own websites and pinpoint problems that need fixing, monitors can be set up for any site—not just our own. This means we can monitor the competition as well.
The upside to monitoring the competition is that you can get an idea of how a competing website might be performing from around the world, and gauge whether your website is competing as well in those areas. Furthermore, you can see how long their page load times are and find out what features on their website may be slowing them down. It could help you figure out what to avoid in your own design or focus on what to do better in your market, for example.
You can test-drive this concept with our risk-free 14-day trial. Try it out today and start gathering actionable data on your website – and your competition’s!
]]>Web developers know browser compatibility can be a real headache. But browser compatibility doesn’t just affect web developers. Recently, one AlertBot customer received an alert that their site had failed. When investigating the failure, they found that their site was not completely down; rather, AlertBot had discovered that the site had stopped working in just one browser. Their website was working fine in Chrome, Internet Explorer (IE), Safari, etc., but had stopped loading in Firefox. Thanks to AlertBot’s TrueBrowser™ Monitoring options, which allowed them to test their website in multiple browsers, they were able to quickly identify and fix the problem with that one browser.
For web developers, it’s easy to simply open your site in each of the popular web browsers to check it for compatibility, find that it’s working smoothly, and then never follow up on it again. However, websites, servers and backend resources change often. AlertBot’s TrueBrowser™ Monitors can be set up to check your site regularly with each of the popular web browsers and make sure nothing has changed. For example, you can set up one Test Scenario to check your website with Chrome, another to check it with Firefox, another with IE, and so on, as sketched below. This way, you’ll know the very instant your site stops functioning in one of these popular browsers.
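As a rough illustration of the concept (a homegrown sketch, not AlertBot’s TrueBrowser monitors; the URL and the simplistic “health check” are stand-ins), you could loop the same page through several real browsers with Selenium and flag any that fail:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

SITE = "https://www.example.com/"  # stand-in for your site

BROWSERS = {
    "Chrome": webdriver.Chrome,
    "Firefox": webdriver.Firefox,
}

for name, launch in BROWSERS.items():
    driver = launch()  # assumes the matching driver binaries are installed
    try:
        driver.get(SITE)
        # A deliberately simple health check: did the page render any body text?
        ok = bool(driver.find_element(By.TAG_NAME, "body").text.strip())
        print(f"{name}: {'OK' if ok else 'PAGE RENDERED EMPTY'}")
    except Exception as exc:
        print(f"{name}: FAILED ({exc})")
    finally:
        driver.quit()
```

The key point is that each browser actually loads and renders the page, so a failure that only affects one browser engine shows up where a plain HTTP check would report everything as fine.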
It’s also an easy way to worry less about browser compatibility. Think about it: these days, web browsers are constantly auto-updating to new versions, and webmasters are constantly updating their websites. It’s a lot to keep up with – testing your site’s performance in each browser every time this happens – so having an automatic browser monitor frequently testing your site’s reliability is one less worry for website owners.
Take the AlertBot TrueBrowser™ Monitor for a spin with a completely free trial and let us start watching your back for you!
]]>Velocity Conference 2015 Recap
The AlertBot team just returned from last week’s Velocity Conference event in Santa Clara, California. We had a great time meeting a lot of people who share our affinity for web performance. And, despite some air turbulence during the trek that rendered more than one of us uncomfortably queasy, we enjoyed the trip from the humid weather in Eastern Pennsylvania to the crisp breezy air of California.
As a VelocityCon sponsor, we had a booth set up in the exhibit hall, which allowed Velocity-goers to peruse various tables showcasing unique and recognizable products and brands (Even Netflix and Amazon were there?!) and pick up some fun swag along the way. For example, Target had these awesome little plush versions of their canine mascot to give away (which a couple of us snatched up for our little Bots back home), HP had silicone cell phone speaker amplifiers, JFrog had foam frogs and “Batfrog” superhero spoof tees, Verizon offered a pair of ping pong balls, and our booth neighbors xMatters gave away old school handset phone receivers you can plug into your cell phone. So, yeah, there were quite a few fun things you could snag from any given booth.
If you visited us at the AlertBot booth, you had the opportunity to hear a little talk on AlertBot’s monitoring services and then take a spin on our prize wheel (carnival style!). Many attendees walked away with a cool new remote control helicopter, while others grabbed AlertBot swag like travel mugs, highlighters or Frisbees. We even had a drawing for a brand new Apple Watch, which we announced on the last day of the conference. The lucky winner got to take it home that day, too (congratulations to Craig T. from Constant Contact!).
Events like these are great because they allow us all to step out from behind the comfort of a desk chair and computer screen to meet our customers in person and discuss our projects face-to-face. Velocity was a nice opportunity for this.
But hey, if you weren’t at Velocity, that doesn’t mean we can’t meet! Shoot us an email. We’d love to talk to you and tell you why we think AlertBot is right for you – and why we know you’re gonna love it!
]]>
But website monitoring can do so much more. AlertBot’s monitoring service collects all kinds of data about your website that is invaluable to any website owner. For example, each time AlertBot tests your website, it analyzes the load time and performance of every piece of your page and will generate a detailed assessment of how long each component takes to load. This helps you identify potential problem areas for your website’s loading time, including every component’s size, transfer speed, load time and more. With this kind of data, you can pinpoint exactly which areas need improvement.
For instance, some site owners don’t realize how their graphically intensive websites might be causing serious load delays for their users – perhaps only in a specific region or country. Worse yet, if you’re using a lot of third-party code or off-site image hosting on your page, you might not be aware of how it’s affecting your site’s visitors in different parts of the globe.
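As a rough approximation of that kind of component-by-component breakdown (a sketch of our own using Selenium and the browser’s Resource Timing API, not AlertBot’s actual reports; the URL is a placeholder), you can total up what each category of page component costs in size and load time:

```python
from collections import defaultdict
from selenium import webdriver

driver = webdriver.Chrome()  # assumes chromedriver is installed
driver.get("https://www.example.com/")  # placeholder page

entries = driver.execute_script(
    "return performance.getEntriesByType('resource').map(e => e.toJSON());"
)
driver.quit()

# Group components by type (img, script, link, ...) and total size and time.
# Note: transferSize reads as 0 for cross-origin resources unless the host
# sends a Timing-Allow-Origin header.
totals = defaultdict(lambda: {"kb": 0.0, "ms": 0.0, "count": 0})
for e in entries:
    bucket = totals[e.get("initiatorType") or "other"]
    bucket["kb"] += e.get("transferSize", 0) / 1024
    bucket["ms"] += e["duration"]
    bucket["count"] += 1

for kind, t in sorted(totals.items(), key=lambda kv: kv[1]["ms"], reverse=True):
    print(f"{kind:12s} {t['count']:3d} items  {t['kb']:8.1f} KB  {t['ms']:8.0f} ms")
```

If the “img” row dominates, heavy graphics are the likely culprit; if “script” does, third-party code deserves a closer look.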
So website monitoring can do a lot more for you and your business than you might realize. Give AlertBot’s risk-free 14-day trial a chance and start learning how to increase your website’s potential right away.
]]>For website owners, uptime is about as crucial as an unlocked front door is to a 24-hour shop: visitors and customers need to be able to reach us at all times. AlertBot’s service can ensure that uptime is consistent and reliable. Of all its features, AlertBot’s alerting process is what ultimately gives us website owners peace of mind.
AlertBot’s alerting system differs from most in how hard it works to avoid false alarms. No one likes getting an alert that their site is down when it really isn’t, and AlertBot combats this by testing your site’s availability from more than one location before sending you that digital elbow nudge about your site’s downtime. For example, if a test server in New York reports that your site is down (or producing an error), AlertBot will test it from another location—say, California—within 60 seconds. Only after the failure is verified from this second location will it deem the error legitimate and begin alerting. You won’t be getting an alert based on a brief outage in one isolated location.
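The logic is simple to picture. Here’s a bare-bones sketch of the verify-before-alerting idea (our illustration, not AlertBot’s actual system; in production the two checks would run from geographically separate test stations, and the URL is a placeholder):

```python
import requests

def is_up(url: str) -> bool:
    """Return True if the site answers with a non-error status."""
    try:
        return requests.get(url, timeout=10).status_code < 400
    except requests.RequestException:
        return False

def confirmed_down(url: str) -> bool:
    # In a real monitoring network, the second check would come from a
    # different location (e.g., California verifying New York's result).
    if is_up(url):
        return False  # first check passed; nothing to report
    return not is_up(url)  # alert only if an independent re-check also fails

if confirmed_down("https://www.example.com/"):
    print("Outage confirmed by a second check -- start alerting")
```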
The alerting process is versatile as well. You can be alerted via email, text message or automated phone call, or through any combination of these options. With SMS text messaging, for example, you’ll get a message from AlertBot specifying what went down, and you can then take whatever steps are needed—depending on the cause of the error—to get things running smoothly again. AlertBot will continue to test your site’s availability until it recovers, and you’ll be notified via text again once it’s back up, along with how long your site (or the specified portion or page of your site) was inaccessible, whether minutes or hours. It’s a great way to remain aware of your website’s performance day or night, and a great way to pinpoint problem areas of your site so you know what to fix or improve.
For more information about AlertBot’s alerting services and features, click here.
]]>We’re proud to announce that AlertBot is a Silver Sponsor and will exhibit at O’Reilly’s Velocity Conference in Santa Clara, CA. The conference will take place May 27-29th, 2015 at the Santa Clara Convention Center.
What: O’Reilly Velocity Web Performance and Operations Conference. O’Reilly hosts four Velocity Conferences around the world but the Santa Clara conference is the largest with an expected attendance of 3,000.
When: May 27-29, 2015
Where: Santa Clara Convention Center, Santa Clara, California
At Velocity, AlertBot will be demonstrating its TrueBrowser® Web Application Monitoring solution. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions.
Planning to Attend?
If you’re planning on attending, get our discount code to save 25% on your conference passes at registration. For those attending, make sure to stop by AlertBot booth #815 to see a demo and get some cool AlertBot swag. If you would like to sit down for a one-on-one demo during the conference, fill out our form to reserve a time.
]]>Neeraj posted screenshots from his personal Android browser showing a new red “SLOW” icon displayed next to links for YouTube and even a Google search result (scholar.google.co.in, to be exact). Today, we tried to replicate the same result on an iPhone but were unable to bring up any “Slow” icons in our search results. (Comments on Neeraj’s report page seemed to reflect similar experiences.)
So what does this mean? It’s possible that Neeraj stumbled onto a brief Google test of an upcoming search result feature, and if this is indeed on the horizon, website owners may want to do all they can to avoid that dreaded little scarlet branding.
Should this feature come to light soon, now would be the ideal time to find a website monitoring solution for your business’s website to ensure visitors and new clients aren’t deterred by Google’s little warning.
Click here for a list of solutions and more info on how AlertBot can help.
]]>We have released the latest version of AlertBot. The release introduces a new interface when adding a monitor along with a new “Last 10 Waterfall Captures” report.
Monitor Type Interface:
We have done away with the old school dropdown boxes that were displayed when selecting your monitor type and created a new, more descriptive menu, which includes key features and capabilities of each monitor type. We have also changed the monitor names to better reflect the type of test they perform. Our goal with these changes is to make choosing the right monitor type easier.
New “Last 10 Waterfall Captures” Report
The new report shows the last ten waterfalls captured and is part of the larger Website Performance, Transaction Performance and Failure Analysis Reports. This is useful when you’ve recently had performance related errors or if you want to see how your performance varies between tests.
Moving To Next Generation “Real Browser” Monitoring
Another highlight of this release is that customers now have the ability to add our TrueBrowser® Website Full Page Monitors on their own. Previously, our team added the TrueBrowser Website Full Page Monitors while we were deploying additional TrueBrowser Test Stations to increase our capacity. Now that we have built up our capacity, we are opening up the doors for everyone to add this monitor type! This change is part of our move to focus our offering on TrueBrowser Website Monitoring, which provides the best error detection and the most accurate performance reporting.
AlertBot currently offers three types of Real Browser Monitoring: Full Page and Mobile Page, used to monitor individual pages, and the Multi-Step Web Transaction for monitoring online processes like logins and buying a product online.
]]>Not All Website Monitoring is Created Equal
Over the years, we’ve spoken with thousands of IT professionals about website monitoring. One of the biggest misconceptions people have had about website monitoring is that it’s a commodity industry — as if one size fits all. The fact is that it’s actually the complete opposite; every monitoring service is custom-developed: What and how the websites are monitored, what errors are detected, what data is gathered, what features the solutions offer — it’s all different.
Because there are so many disparities between website monitoring services, and because the tools can be confusing at first glance, we’ll break each solution into key components below. This guide will give IT professionals the knowledge necessary to evaluate critical monitoring solutions, troubleshoot slow page load times, and compare real browser monitoring vs. simulated browser monitoring.
In the first part of the guide, we’ll focus on the most important difference between tools: whether or not the monitoring service uses a real web browser when testing.
“Simulated Browser” website monitoring only simulates one part of a web browser: the initial request for the HTML file. While this is adequate for measuring availability, simulated browsers are limited to providing performance data only up until the first byte of data is received (referred to as the “Time to First Byte,” or TTFB). This means that all objects queued for transfer after the first byte—images, videos, JavaScript—are not being loaded, tested, or measured. The result is inaccurate performance reports, and each of these unloaded items has the potential to severely impact the user’s experience and page load time.
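You can see how little a TTFB-only check covers with a few lines of Python (a rough sketch of our own; the URL is a placeholder): the measurement below finishes the moment the first byte of HTML arrives, before a single image, script, or video has loaded.

```python
import time
import requests

def time_to_first_byte(url: str) -> float:
    start = time.perf_counter()
    # stream=True makes requests return as soon as the response starts
    # arriving, without downloading the body.
    with requests.get(url, stream=True, timeout=10) as resp:
        next(resp.iter_content(1))  # read exactly the first byte, then stop
    return time.perf_counter() - start

print(f"TTFB: {time_to_first_byte('https://www.example.com/'):.3f}s")
# Everything after this point -- images, CSS, JavaScript, rendering --
# is invisible to a check that stops here.
```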
We’ve created the below graphic to visually showcase the “cut-off point” of a simulated browser — the first byte.
In contrast to simulated browser monitoring, “Real Browser” monitoring tests a website just like a real user would — by opening up a browser, rendering the page, and executing Rich Internet Applications (RIAs) like AJAX, JavaScript, Silverlight, and Flash. There is no difference between how a real user and the monitoring service load a web page, so you’re getting the most accurate and actionable testing and performance data through this solution.
(Comparison graphics: “What Simulated Monitoring Sees” vs. “What Real Browser Monitoring Sees”)
For those IT professionals who want real browser monitoring but need to justify it to higher-ups or other departments, this section is for you.
Increase Sales – We have been offering real browser monitoring for a few years. In that time, we have uncovered and resolved a number of site-crippling technical issues for our customers. These customers continue to use the data AlertBot provides to constantly make improvements to their site, further minimizing page load-time. Some customers observed an immediate sales increase from improvements made using AlertBot. We’ve even been told of a few instances where companies doubled or tripled their sales.
Protect Your Brand – Whether the monitored website is a corporate page or an e-commerce website handling product sales, a fast website is important. In fact, search engines like Google include page speed as a ranking factor. A fast, functional website shows pride in how your brand is represented online. In addition, if there is a problem, real browser monitoring is the most advanced external website monitoring available and will help you pinpoint the exact issue.
Stakeholder Transparency – If your marketing department checks with IT to see if there is a problem with the website because traffic or sales are down, real browser monitoring is the best solution. From an IT standpoint, real browser monitoring provides a complete picture of what users are experiencing. You can also set the marketing folks up with alerts, or give them access to reports, so they always know how the website is doing.
When searching for a monitoring service, it’s about finding the best tool for the job. Every company today needs to stay on top of page load times. A website can slow down at any time for thousands of reasons; knowing the root cause of slowdowns right away is something only real browser monitoring provides. If you’re unsure what type of browser a website monitoring service is using, feel free to ask in the comments section and we’ll reach out with the answer.
]]>We have released the latest version of AlertBot. This is a minor release that includes performance enhancements to reports and a bunch of fixes.
Here’s a Breakdown:
– Improved the performance of reports
– Fixed Transfer Rate report key on accounts with a large quantity of monitors
– Fixed broken links in Quick Links side bar
– Fixed black links on Quick Stats reports that occurred on accounts with a large quantity of monitors