As we enter another year and another Patch Tuesday, Microsoft has made patch notifications that little bit harder for the average customer by discontinuing the Advance Notification Service (ANS). Alongside the regular Patch Tuesday updates, Microsoft had published an advance notification on the first Friday of each month, giving security teams a good idea of what to expect on Patch Tuesday.
They haven’t scrapped it altogether, though: ANS is still offered to paying customers. According to Microsoft, the reason is that customers no longer use ANS, with many simply waiting until Patch Tuesday itself. It could be argued, however, that for smaller businesses that can’t afford a paid service like this, the change could have an impact on how they deploy patches.
Fear not, however: all of Verismic’s customers will still have all patches fully tested and rolled out to agreed schedules via Verismic Syxsense.
A light patch update
We’ve all enjoyed our Christmas break and so, it would seem, have security researchers. This month’s Patch Tuesday is fairly light, with just eight patch updates and only one rated Critical. There appears to be nothing special or particularly significant about January’s updates – and it’s rare to be able to say that, as there are usually at least one or two updates that deserve special attention due to the seriousness or uniqueness of the vulnerability.
As ever, we have broken down the patch updates to give you a better understanding of which systems could be affected, and have included the independently assessed Common Vulnerability Scoring System (CVSS) score from US-CERT.
The only Critical patch update this month, MS15-002, has a CVSS score of 9.3 (out of a possible 10). It is a serious patch and definitely the one that needs to be your top priority. It addresses a buffer overflow vulnerability that could allow remote code execution, caused by the Microsoft Telnet service improperly validating memory locations. Attackers can exploit this vulnerability by sending specially crafted Telnet packets to a Windows server, potentially enabling them to run arbitrary code on the target server.
The other seven updates are all rated Important by Microsoft’s standard, but if we take a look at the table below, US-CERT considers only three of them genuinely serious (MS15-001, MS15-003, MS15-004); the remaining four are rated 5.0 and below. Whilst these are still vulnerabilities that need to be patched, US-CERT has judged the chances of exploitation to be quite low and, having assessed the potential impact (again likely to be low), has given them a low risk score.
It’s such a light Patch Tuesday this month that working out which patches to prioritise is fairly straightforward. Get the Critical update done first, and then work through the list. If, like Verismic, you want to take into account the CVSS scores, then the table below is listed in order of most serious to least – use this to prioritise your patch roll outs as we will for our customers.
Vulnerability in Windows Telnet Service Could Allow Remote Code Execution (3020393)
Vulnerability in Windows Components Could Allow Elevation of Privilege (3025421)
Vulnerability in Windows Application Compatibility Cache Could Allow Elevation of Privilege (3023266)
Vulnerability in Windows User Profile Service Could Allow Elevation of Privilege (3021674)
Vulnerability in Network Policy Server RADIUS Implementation Could Cause Denial of Service (3014029)
Vulnerability in Network Location Awareness Service Could Allow Security Feature Bypass (3022777)
Vulnerability in Windows Kernel-Mode Driver Could Allow Elevation of Privilege (3019215)
Vulnerability in Windows Error Reporting Could Allow Security Feature Bypass (3004365)
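As a sketch of this CVSS-first ordering, here is a minimal Python example. Only MS15-002’s 9.3 score is taken from the text above; the other bulletins’ scores are illustrative placeholders, not US-CERT’s published figures.

```python
# Sketch: order a month's bulletins by independently assessed CVSS score.
# Only MS15-002's 9.3 comes from the article; the other scores are
# illustrative placeholders, not US-CERT's published figures.
patches = [
    ("MS15-002", "Windows Telnet Service RCE", 9.3),
    ("MS15-001", "App Compatibility Cache EoP", 7.2),   # placeholder
    ("MS15-004", "User Profile Service EoP", 7.2),      # placeholder
    ("MS15-007", "NPS RADIUS DoS", 5.0),                # placeholder
]

# Highest CVSS first: this becomes the roll-out order.
rollout_order = sorted(patches, key=lambda p: p[2], reverse=True)

for bulletin, title, score in rollout_order:
    print(f"{score:>4}  {bulletin}  {title}")
```

However you weight the scores, the Critical Telnet patch comes out on top, which matches the prioritisation above.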
Questions need to be asked of Patch Tuesday and Microsoft’s approach to it, says Robert Brown.
SC Magazine | Dec 17, 2014
The next Patch Tuesday, Microsoft’s usual day to issue security updates for its software, is looming again: 13 January 2015, then February, and so on. It’s so frequent that it’s easy to treat it as a ‘business as usual’ exercise, so humdrum that it requires no second thought or intelligence.
However, it really does need that second thought. Patching is obviously essential – companies do need to protect themselves from known software vulnerabilities – but there are problems with Microsoft’s approach to patching, and simply installing every patch with the quick click of a button could be costly; worse, you might just see the Blue Screen of Death (BSOD) across your device fleet.
Microsoft’s approach to patching is very much a ‘fire and forget’ exercise: it issues patch updates each month and expects businesses to roll them out as soon as possible. This is where your second thought is needed. As many IT managers will attest, they cannot, and should not, deploy patches right away. IT must take a phased approach and test the patch updates before rolling them out, helping to mitigate any problems.
Just take a look at MS14-066 – a lot of users reported problems when implementing the update, forcing Microsoft to reissue the patch. Imagine if every business had implemented that immediately! If there is a compatibility issue with a patch and systems need to be rolled back, this extends downtime and can impact the business’s bottom line.
Compatibility aside, my real issue with Patch Tuesday is Microsoft’s rating system. It is relatively simple to follow:
‘Critical’ – A vulnerability that could allow remote code execution without user interaction or where code executes without warnings or prompts.
‘Important’ – These vulnerabilities are where the client is compromised with warnings or prompts and whose exploitation could result in compromise of data.
‘Moderate’ – The impact is mitigated by numerous factors such as authentication or non-default applications being affected.
‘Low’ – The impact is comprehensively mitigated by the characteristics of the component.
Keep in mind that Microsoft self-certifies vulnerabilities for its products. November’s Patch Tuesday contained 14 separate patches fixing almost 40 vulnerabilities, as well as an out-of-band patch a week later; five of the updates, including the out-of-band patch, were rated by Microsoft as Critical, eight Important and two Moderate.
Where to start? With the obvious, surely? Patch the Critical updates first and take the rest in turn. Better still, do them all at once! This couldn’t be more wrong. My advice would be to take Microsoft’s vulnerability ratings with a respectful pinch of salt and start looking at independently assessed scores, such as the Common Vulnerability Scoring System (CVSS) to get a more informed view. Each month US-CERT uses CVSS to rate all of Microsoft’s patch updates the same day they’re released, giving a much better understanding of the risk a particular vulnerability poses to the business.
If we look again at November’s Patch Tuesday, US-CERT gave the out-of-band patch, rated as Critical by Microsoft, a score of 10.0 – that’s as serious as it can get and gives a good starting point for patching activities. It’s now top priority.
Three other Critical patches were scored 9.3 by US-CERT, which suggests Microsoft has got this right and they should be the next area of focus. Time to get to work.
But the last remaining Critical patch was scored only 6.8 by US-CERT. This is a really important discovery, because six other patches, some deemed only Moderate or Important by Microsoft, were rated higher than 6.8 by US-CERT. In other words, some of those Moderate and Important patches should be tackled before the last remaining Critical one.
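The re-ranking this implies can be sketched in a few lines of Python. The figures below are illustrative, modelled on the November pattern described here (one Critical at 10.0, one at 6.8, and Important/Moderate patches scored in between); they are not the actual bulletin data.

```python
# Sketch: why vendor severity alone can mislead. Scores are illustrative,
# modelled on the November 2014 pattern (a Critical at 6.8 outranked by
# Important/Moderate patches with higher CVSS scores).
patches = [
    {"id": "A", "vendor": "Critical",  "cvss": 10.0},
    {"id": "B", "vendor": "Critical",  "cvss": 6.8},
    {"id": "C", "vendor": "Important", "cvss": 9.3},
    {"id": "D", "vendor": "Moderate",  "cvss": 7.1},
]

SEVERITY_ORDER = ["Critical", "Important", "Moderate", "Low"]

# Queue by vendor rating alone (stable sort keeps bulletin order within a tier).
by_vendor = sorted(patches, key=lambda p: SEVERITY_ORDER.index(p["vendor"]))
# Queue by independently assessed CVSS score instead.
by_cvss = sorted(patches, key=lambda p: p["cvss"], reverse=True)

# Under vendor ordering, the 6.8 Critical ("B") jumps the queue;
# under CVSS it correctly drops below "C" and "D".
print([p["id"] for p in by_vendor])  # ['A', 'B', 'C', 'D']
print([p["id"] for p in by_cvss])    # ['A', 'C', 'D', 'B']
```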
This isn’t a one-off slip from Microsoft either. In October’s Patch Tuesday, three Critical and two Important updates were all given an equal 9.3 rating by US-CERT. IT managers relying only on Microsoft’s ratings might have delayed those two Important updates.
Microsoft is providing a great security service that everyone is thankful for, but it does need policing by a second source. The Critical is not always critical, and sometimes the Moderate needs urgent attention too.
It seems that it was only yesterday that patch/update Tuesday came and went, yet the next one is looming already.
As an IT guy I actually look forward to seeing the types of vulnerabilities that have been discovered in Microsoft’s products. Some are obviously more interesting than others, such as the vulnerability in Schannel, but what they all have in common is that they actually do pose a threat to your business.
We all know that patching is a vital process in keeping our businesses safe, but I do have some issues with Microsoft’s approach to patching. It’s very much a “fire and forget” exercise for them, whereby patch updates are released each month and your IT team is then expected to roll them out across the business.
Whilst this may be the most efficient way of releasing patches from Microsoft’s point of view, there are many instances where simply rolling them out is not an option. IT teams need to take a phased approach and test the patch updates before rolling them out, helping to mitigate any problems such as the dreaded blue screen of death.
Case in point was November’s MS14-066 update – there were a lot of reported problems when implementing the update, with Microsoft having to reissue the patch. Imagine if every business had implemented that immediately!
Keep in mind that Microsoft self-certifies vulnerabilities, and it has a fairly easy-to-follow rating system:
• Critical – A vulnerability that could allow remote code execution without user interaction or where code executes without warnings or prompts.
• Important – These vulnerabilities are where the client is compromised with warnings or prompts and whose exploitation could result in compromise of data.
• Moderate – The impact is mitigated by numerous factors such as authentication or non-default applications being affected.
• Low – The impact is comprehensively mitigated by the characteristics of the component.
If we take a look at November’s Patch Tuesday, there were a total of 14 separate patches fixing almost 40 vulnerabilities, as well as an out-of-band patch a week later; five of these were rated as Critical. So how do you prioritise those five if they’re all rated the same? Which vulnerability do you patch first?
Rolling out patches is all well and good if your business is located in one or two premises, but what if it has a number of remote locations? Retail, transportation, and oil and gas are all good examples.
If you were to take a large retail store open 24 hours a day, there needs to be a window of time where the systems are taken offline so they can be updated. Microsoft’s approach would be to suggest patching the Critical vulnerabilities first, and then work through the rest.
At Verismic, we provide a service to our customers to ensure that their entire IT infrastructure remains as up-to-date as possible, which includes rolling out any patch updates from vendors. We do this by creating a baseline – identifying the most important update for the business – and then working backwards. It’s important to do this because, as we said, many businesses simply don’t have the time or even the bandwidth to roll out all of the patch updates at once.
To create this baseline we use three different measurements: vendor severity (Microsoft’s self-certified rating), the Common Vulnerability Scoring System (CVSS) score, and the total number of vulnerable systems in the customer’s environment. By measuring against three separate metrics, we get a much better understanding of the risk a vulnerability really poses.
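A minimal sketch of such a three-metric baseline, assuming illustrative weights and a made-up severity mapping (this is not Verismic’s actual formula, which isn’t published here):

```python
# Sketch of a three-metric patch baseline. The weights and the severity
# mapping are illustrative assumptions, not Verismic's actual formula.
SEVERITY = {"Critical": 10, "Important": 7, "Moderate": 4, "Low": 1}

def baseline_score(vendor_severity, cvss, vulnerable_systems, total_systems):
    """Blend vendor rating, CVSS and environmental exposure into one score."""
    exposure = vulnerable_systems / total_systems  # share of the estate at risk
    return 0.3 * SEVERITY[vendor_severity] + 0.4 * cvss + 0.3 * (10 * exposure)

# A widely exposed Important patch can outrank a narrowly exposed Critical one.
narrow_critical = baseline_score("Critical", 9.3, vulnerable_systems=20, total_systems=1000)
wide_important = baseline_score("Important", 7.5, vulnerable_systems=900, total_systems=1000)
print(round(narrow_critical, 2), round(wide_important, 2))
```

The point of the third metric is visible in the example: a Critical vulnerability affecting 2% of the estate can score lower than an Important one affecting 90% of it.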
My advice would be to take Microsoft’s vulnerability ratings with a respectful pinch of salt and start looking at independently assessed scores, such as CVSS. Each month US-CERT uses CVSS to rate all of Microsoft’s patch updates the same day they’re released, giving you a much better understanding of the risk a particular vulnerability poses to your business.
Patching is invaluable to protecting your business. By taking a phased approach to updating systems and creating a baseline to understand the risk of each vulnerability, you can get a much better idea of which patches you should be prioritising first.
Verismic is pleased to announce we have been awarded Most Innovative Product 2014 for Syxsense.
Ashley Leonard, CEO, said: “It has been an exciting year for us with the launch of Syxsense; being recognized as one of the Top Innovative Products of 2014 is a great way to end the year.”
The Best in Biz Awards honors companies, teams, executives and products for their business success, and is the only independent business awards program judged by members of the press and industry analysts.
One of this year’s judges, Mark Huffman of Consumer Affairs, said: “In the Internet age, it has never been more important to ensure your customers have a positive experience and, should there be a problem, to address it. These companies ‘get it,’ and that’s not only good for them, but good for customers too.”
In the short span of a decade, innovative electronic devices such as laptops, tablets, smartphones, and Internet engagement channels have made an indelible impact on everyday life, revolutionizing the means and speed with which people communicate, socialize, and purchase goods and services. Combining the personal and business use of high-tech devices and applications, however, is a more recent phenomenon that’s blazing an irreversible trail.
While the growing movement of versatile devices in the workplace provides flexibility and offers a range of options to increase employee productivity, it puts the modus operandi of back-office technology in peril, leaving IT departments precariously teetering on the edge of falling from hero to zero.
The Driving Force Behind Advancing Technologies
The consumerization of IT, coupled with Bring Your Own Device (BYOD), is more than just a trend. Steered by a younger, more mobile generation of employees—raised with connected devices and uninhibited by the notion of work/life balance—BYOD is the driving force behind the inspiration of advanced technologies with the potential to make the workplace more efficient and employees more productive. Yet, this same force that is driving technology in a direction of infinite possibilities is also at work in an opposite direction, significantly impacting IT administrators who feel pressured to protect their technology universe with black-hole policies where nothing is allowed to pass through nor escape.
From the outside, some see IT departments as having a reputation for using “no” as the default response to newer technology or operational requests, whether to buy more time or as a genuine attempt to protect company policies and procedures. Although not an ideal or sustainable solution, IT departments may be at risk of becoming marginalized within enterprises as the speed of technology surpasses the speed of IT response. As today’s employees can walk into a store, buy a phone, and access company email within minutes, bypassing IT completely, a “no” from IT often only results in an unproductive and unnecessary game of cat and mouse—inevitably ending in frustration and internal conflicts.
Contrary to popular belief, IT does not intentionally oppose innovation, forcing employees to search for covert means to bypass IT and ultimately risk company security. However, the onus will invariably fall on IT administrators—whose survival depends on a willingness to adapt—to search for solutions that redirect policy-based collaboration and mitigate shadow IT, rather than identify new ways to block users from accessing sensitive information and connecting to company networks.
While providing unmatched technical expertise, IT departments face unique challenges and important decisions, particularly in relation to their shifting roles within the organization, along with employee demands regarding accessibility and flexibility. Bridging the chasm will require administrators to not only provide a common goal and a starting point from which all players have an equal advantage, but also transform from a technology provider to a technology partner. In other words, IT must evolve from the traditional department of “no” to the supportive and collaborative department of “now.”
Harnessing the Power of the Cloud
Traditional IT provisioning is often a slow and manual process, while new cloud-based solutions are automated, allowing for increased flexibility, improved agility for administrators, and enhanced efficiency that helps support a mobile workforce. With cloud management, organizations can cost-effectively support and manage a range of endpoint systems, from desktops to virtual workspaces, while improving access to vital applications and databases. In addition, these advanced solutions optimize performance and support virtualized environments without adding complexity, allow administrators to quickly find and fix infrastructure issues, provide end-to-end performance monitoring and configuration management, minimize disruptions, and reduce time, cost, and risks during migration to new environments.
As new cloud technologies emerge, collaboration between IT and the business is essential. To seize an expanded role while keeping pace with innovation, IT teams must take the lead and assume the position of driver and trusted advisor—allowing organizations to create competitive advantages by utilizing cloud solutions to solve complex technology challenges.
While many enterprises already employ a hybrid of on- and off-premise solutions, how many end users have Dropbox or Box and utilize Salesforce or Office 365? As this major shift occurs—with or without the consent of IT—organizations are bound to question if the IT department is an enabler or roadblock to innovation.
Collaboration and Innovation
By determining where and how IT departments can best support the enterprise and enhance the productivity of employees, they are sure to foster a culture of collaboration and innovation. Ultimately, this protection of the organization’s most valuable assets will secure IT’s place and guide companies through the next wave of new technology.
Ashley Leonard is the president and CEO of Verismic Software and a technology entrepreneur with 25 years of experience in enterprise software, sales, operational leadership and marketing, including nearly two decades as a successful senior corporate executive and providing critical leadership during high-growth stages of well-known technology industry pioneers. Verismic Software, Inc. provides cloud-based IT management technology and “green” solutions focused on enabling greater efficiency, cost-savings and security control for users, all while engaging in endpoint management.
Analyst: Andrew Donoghue | 18 Nov 2014 ~ Best known as a PC power management and endpoint management supplier, the company is planning to add dedicated datacenter IT asset management tools, as well as server power management, to its cloud-based IT management suite.
Buzzwords are a fact of life in the technology profession. Whether you’ve been in the industry for 30 years (remember WYSIWYG?) or for five (netiquette, anyone?), it’s a good bet you’ve incorporated techspeak into your everyday conversation, maybe without even knowing it.
As the global data tsunami continues to build, and a new wave of technologies from the consumer world hits IT, it’s not surprising that the buzzword count has surged. Here’s a look at eight of the hottest buzzwords being used today.
1. IoT (Internet of Things) or IoE (Internet of Everything)
The IoT is the chatty network that’s formed when the devices and “things” we use in our everyday lives – automobiles, thermostats, appliances, fitness bands, even toothbrushes – talk to each other through embedded technology and Web connectivity. While this term has been around for at least a decade, it’s only recently that the general public has fathomed its impact on our lifestyles.
“In the not too distant future, consumers will be able to tell their house to turn on the lights, unlock the doors, open the garage and report on how much milk is left in the fridge, all from the comfort of their car on their commute,” says Jeff Remis, branch manager of the IT division at the Addison Group.
“As technology continues to evolve, the more connected and automated every aspect of our lives will be.”
As a result, IoT is almost always brought up when industry pundits discuss “disruptive” technology trends. “Working for Ericsson, I hear this almost every day. With ideas like connected vehicles, M2M, and so on, this is very relevant,” says Samuel Satyanathan, director of strategy and engagement at Ericsson.
With the number of wireless connected devices exceeding 16 billion in 2014, according to ABI Research – 20% more than in 2013 – some prefer the term “Internet of Everything.” “This is just an expansion of the ‘Internet of Things’ to emphasize that everything is becoming a connected device, from mobile phones, appliances and cars, to animals,” says Ken Piddington, CIO at MRE Consulting. Indeed, ABI forecasts the number of connected devices will more than double from the current level, to 40.9 billion in 2020.
2. BYOE (Bring Your Own Everything)
Of course you’ve heard of BYOD, or “bring your own device,” which is the trend among businesses to allow employees to use their own personal mobile phones, tablets and laptops for work. But with the growth of mobile devices, including wearable technologies, some say the new umbrella term will be BYOE, or “bring your own everything,” Piddington says.
Already, Cognizant Technology Solutions has coined the term BYOHD, or “bring your own health device,” referencing the growing number of embedded or wearable devices that enable patients to collect data on vital signs, genetics, health history, fitness levels, activity levels, body-mass index, sleep patterns and more.
3. Dual Persona
Thanks to BYOE, another buzzword making the rounds is “dual persona,” which refers to mobile phones that enable people to maintain separate environments for personal and business use on the same device. “Users can have both a work and home profile simultaneously, and by separating these two personas, they can segment and protect personal and corporate data,” says Ashley Leonard, president and CEO of Verismic Software, a global provider of IT management solutions delivered from the cloud.
4. Wearable Technology
When Google first released its plans for augmented reality glasses, or Google Glass, it was met with skepticism and a healthy number of parody videos. Even today, the device is seen by many as “odd but interesting,” as one blogger puts it. Still, while commercial success eludes most forms of wearable technologies today, the idea of wearing devices that would automatically consume, share, transmit, analyze and present vital information to or about us is no longer seen as a joke.
“This is a very trending development at the moment, from health devices to new mobile technologies, and is seeing rapid expansion and advancement,” Leonard says.
The wrist has been deemed the most realistic place for a wearable to be worn; witness the assortment of activity trackers and smartwatches that have made their way to the market from industry heavyweights like Samsung, Sony and Apple. However, it seems no area of the body will go unconsidered, with companies developing smart rings, insole sensors, glucose-level detectors inserted under the skin, posture-detecting pins and more. According to IDC, wearables have moved out of the early-adopter realm, with shipments exceeding 19 million units in 2014, more than tripling last year’s sales, and swelling to 111.9 million units in 2018, resulting in a CAGR of 78.4%.
5. Quantified Self
The buzz around wearable technologies is driving interest around what some call the “quantified self,” Leonard says, which is a movement geared toward gathering data about any aspect of your daily life and using that information to optimize your behavior. Chris Dancy, a top proponent of the trend, claims to have lost 100 pounds and kicked a two-pack-per-day smoking habit by logging and analyzing data on his everyday activities, including sleeping, eating and even his moods. Numerous meetups and forums now exist to support people interested in quantifying their own lives.
“If the advent of wearable technology is any indication, this term is one that will stick around, and I am a huge fan of this idea,” Remis says. “Wearables are emerging to track insulin levels and even the air quality around you. The smart watch will be a big-ticket item this holiday season – and it’s just the beginning.”
6. XaaS (Everything as a service)
It all started with “software as a service,” but the as-a-service trend soon spread to a multitude of areas, including platform, infrastructure, storage, communications, network, monitoring and business process as a service. It’s no wonder, then, that many now simply say “everything as a service,” or XaaS (pronounced “zaas”). “I think it will start to become more widely used, as ‘everything’ is becoming available as a service,” Piddington says, even outside the technology realm. “You’ve got cars (ZIP Cars), housing (AirBnB), legal (LegalZoom) — the list continues to go on and on.”
Others prefer the more traditional nomenclature. “Personally, I am not a fan of this word and would still rather go with specific ones, like SaaS, PaaS, etc.,” Satyanathan says. For SaaS fans, Piddington offers the verb form, “SaaSified,” or the process of taking a traditional on-premise application and moving it to the cloud or making it available as a service. “I first heard this from a vendor of mine as they were describing how they were moving their core products to the cloud. I’ve been using it ever since,” he says. At least it’s more specific than cloud-ified.
7. Small Data
Once buzzwords hit their peak on the hype-o-meter, it’s not uncommon for industry pundits to rethink the meaning behind the word and hit upon more relevant variants. This is why you may have heard talk of “small data” and even “dark data,” Piddington says. Because big data is sometimes overkill for certain purposes, more people are starting to talk about small data, which according to the Small Data Group, connects people with timely, meaningful insights (derived from big data and/or “local” sources), and is organized and packaged – often visually – to be accessible, understandable, and actionable for everyday tasks.
Dark data, meanwhile, is the operational data that businesses collect but don’t optimize for competitive purposes, Piddington says. According to Gartner and other sources, the hazards of dark data range from lost business opportunity and higher than necessary storage costs, to security risks.
8. Ransomware
Ransomware refers to malware that infects a user’s computer and typically encrypts sensitive data until a ransom has been paid, Leonard says. An example is CryptoLocker, a damaging strain of malware that uses encryption to lock the most valued files of victim users. Many malware variants are now being created, “proving that ransomware is going to be an ongoing problem for home users and businesses alike,” Leonard says.
For companies, these types of attacks could have devastating consequences as local drives and corporate network data are all potentially encrypted, he points out. “Many victims who actually paid the ransom later reported that their data was never released, demonstrating the need for requirements of good security practices and strong IT management technology that allows all network endpoints to be actively managed and patched,” Leonard says.
So, where will the next buzzwords come from? If not from tech marketers, the answer will likely come from the “digital native” set, or the younger generations who have never known what it is like to not have constant and easy connectivity to the Web. For his part, Piddington keeps his ear tuned to the conversations of his 12-year-old son and his friends. Hence his use of the word “laggy.” “This is what he and his friends call a slow Internet connection. I seem to hear it said often when a large group of them are playing Minecraft,” Piddington says.
Brandel is a freelance writer. She can be reached at email@example.com.
The workplace trend of BYOD (Bring Your Own Device) is nothing new. What remains unclear, however, is the burden of ownership, cost and security. When employees bring their own cell phones, laptops or tablets to work, there’s a fair chance they’ve personally purchased those devices—data plans and all. In fact, some employers today require a BYOD policy, with no intention of paying for any of it. As one CIO bluntly put it, “Well, we don’t buy their pants either, but they’re required for the office.”
Fortunately, not all employers take such a cynical approach to workplace reimbursement, nor do they subscribe to a one-size-fits-all BYOD policy. While many view the trend as a potential win-win for everyone, the need for clarity is apparent. At least that’s what the California Court of Appeals decided when it handed down a ruling in August 2014 regarding the workplace trend. In Cochran v. Schwan’s Home Service, the court stated:
“We hold that when employees must use their personal cell phones for work-related calls, Labor Code section 2802 requires the employer to reimburse them. Whether the employees have cell phone plans with unlimited minutes or limited minutes, the reimbursement owed is a reasonable percentage of their cell phone bills.”
This ruling solidified the responsibility of employers throughout the state of California to now provide reasonable reimbursement to all employees using their personal cell phones for work-related calls.
Indirectly, the ruling opened up a Pandora’s Box, unleashing ambiguous questions and concerns regarding data security, liability and actual reimbursement percentage figures—for all devices.
Just the thought of required reimbursement has left many business owners and CIOs feeling uncertain about the reality of BYOD’s future. While the practice isn’t exactly new, the trend is contemporary enough for a few larger companies to consider the recent court decision a death knell.
Before we throw the BYOD baby out with the bathwater, let’s examine the facts of this widely misunderstood case. First, the ruling pertains exclusively to employee cell phones. Second, the now-required reimbursement is based on a “reasonable” percentage—partial, not complete; and finally, California is the only U.S. state affected by this decision so far.
While the court decision will undoubtedly have an impact on BYOD practices throughout the U.S., the benefits of the trend unarguably outweigh the deficits. BYOD was established to accomplish objectives for both the employer and employee. In theory and in practice, BYOD gives employees freedom to utilize cutting-edge technology, which has the capacity to not only enhance their own job performance but also benefit the corporate entity or employer, who also garners the additional benefits of lowering overhead costs and alleviating liability for devices connecting to the corporate network.
The trend, when properly implemented and regulated, has the ability to grant employees access to enterprise data from a single device. It also potentially benefits the IT department by eliminating the need to manage these personal devices. For example, if an employee downloads a pirated movie onto a work device their employer (the owner of the device) could be held legally liable; however, with BYOD, the device is owned by the employee so the liability lies with them personally.
Down the Rabbit Hole
Perhaps the real debate lies with provisions and compliance. In response to the California court ruling, the National Law Review recently advised employers to revisit their company cell phone policies. This call for review is a good start and should prompt employers to institute more comprehensive BYOD policies designed to protect the privacy of both the employee and the corporation. Companies and employees would also greatly benefit from clearly defining their “percentage of reimbursement,” shifting the liability to the center, and firmly differentiating business and personal use. On the other hand, this could lead to more concerns regarding ownership and responsibility of home Internet connections and cable bills. Drawing a line in the sand will be an ongoing challenge—at least for now.
In the meantime, enterprise solutions currently deployed by California companies need not be affected by the recent ruling. Some of the more comprehensive options, built with enterprise-grade security in mind, can proactively monitor and manage an environment from any web browser, so the type of device used should have no effect on employee productivity or corporate security.
Reconfiguring the System
If BYOD vanishes from our corporate landscape, the only viable alternative will be to take a step backward. By chaining employees to outdated or unsuitable corporate-owned devices and software that require maintenance and careful monitoring, companies risk the real possibility of not only impeding an employee’s performance but also discouraging an already skittish workforce—a high price to pay.
If nothing else, the ruling will push us in another direction: one where new enterprise solutions are required to navigate uncharted waters. BYOD isn’t dying; it’s evolving.
ABOUT THE AUTHOR: Ashley Leonard is the president and CEO of Verismic Software, a global industry leader providing cloud-based IT management technology and green solutions. A technology entrepreneur with 25 years of experience in enterprise software, sales, operational leadership and marketing, he has spent nearly two decades as a senior corporate executive, providing critical leadership during high-growth stages of well-known technology industry pioneers. He founded Verismic in 2012, after successfully selling his former company, NetworkD, an infrastructure management software organization. In his present role, Leonard manages U.S., Australian and European operations, defines corporate strategies, oversees sales and marketing, and guides product development. Leonard works tirelessly to establish Verismic as the leading provider of IT endpoint management solutions delivered from the cloud by building beneficial industry partnerships and creating a strong, innovation-driven culture within the Verismic workforce, all while delivering returns to Verismic’s investors. Verismic’s latest offering, Syxsense, is an agentless, cloud-based IT management software solution that is revolutionizing the way IT professionals engage in endpoint management.
ABOUT VERISMIC: Verismic Software, Inc. is a global industry leader providing cloud-based IT management technology and green solutions focused on enabling greater efficiency, cost-savings and security control for users, all while engaging in endpoint management. Headquartered in Aliso Viejo, Calif., Verismic is a growing and dynamic organization with offices in four countries and 12 partners in nine countries. Over the past two years, Verismic has worked with more than 150 companies, ranging from 30 to 35,000 endpoints, delivering a variety of solutions for organizations of all sizes as well as managed service providers (MSPs). Verismic’s software portfolio includes the first-of-its-kind agentless Syxsense; Power Manager; Software Packaging; and Password Reset.
SALT LAKE CITY, UT and ALISO VIEJO, CA–(Marketwired – Oct 22, 2014) – Verismic — a global provider of IT management solutions delivered from the cloud — today announced the expansion of its cloud software operations with the opening of a new development center in Salt Lake City.
“Salt Lake City is an ideal location for great software development talent, allowing us to continue developing industry-leading cloud technologies,” says Verismic President and CEO, Ashley Leonard. “Our presence in the burgeoning cloud technology industry strategically positions us for growth as we continue to develop innovative solutions to complex infrastructure problems.”
Headquartered in Orange County, Calif., and with offices in the U.K. and Australia, Verismic made its mark by transforming IT management with Syxsense — an agentless, cloud-based IT management software alternative that is revolutionizing the way IT professionals engage in endpoint management. The company also offers a growing product suite of IT support and green technology solutions.
Verismic relocated its Chief Technology Officer, Mark Reed, from Florida to lead the building of the development team in the Salt Lake City region. The expanding company has already hired strong talent and expects to continue this growth with further staff additions through the rest of 2014 and into 2015.
“The skill level within the Salt Lake City area is impressive, and we have been thrilled thus far with the interest in our expansion to the region,” says Reed, a Salt Lake City native.
The newest Verismic Software office is located at 175 West 200 South, Salt Lake City, UT 84101 — in the Historic Firestone Building within the heart of Salt Lake City.
For more information on Verismic’s steady growth and innovative solutions, visit www.verismic.com.
Verismic Software, Inc. is a global industry leader providing cloud-based IT management technology and green solutions focused on enabling greater efficiency, cost-savings and security control for users, all while engaging in endpoint management. Headquartered in Aliso Viejo, Calif., Verismic is a growing and dynamic organization with offices in four countries and 12 partners in nine countries. Over the past two years, Verismic has worked with more than 150 companies, ranging from 30 to 35,000 endpoints, delivering a variety of solutions for organizations of all sizes as well as managed service providers (MSPs). Verismic’s software portfolio includes the first-of-its-kind agentless Syxsense; Power Manager; Software Packaging; and Password Reset. For more information, visit www.verismic.com.
Ashley Leonard, president and CEO of Verismic, explains his view on the imperative of simplifying endpoint management
It’s now well accepted that employees use multiple devices in the workplace. Desktop PCs have been augmented with laptops, tablets and smartphones. The Internet of Things will make the penetration of internet connected devices into the corporate world even greater.
The risk to the corporate network from unmanaged and potentially unpatched devices, commonly called endpoints, is significant. After all, it only takes one rogue device to create a security flaw, so thousands could wreak havoc. Traditional endpoint management tools fail to protect businesses because they are cumbersome: too complex, function-heavy, unwieldy, and too demanding of resources, especially people and infrastructure.
Endpoint management tools have grown in complexity. Vendors add more and more functions to their core product, often unnecessarily, and all too often fail to help organisations control endpoints quickly and efficiently.
When we speak to organisations, from the smallest to the largest, 99 per cent of the time they primarily want asset inventory and remote control tools. That’s what they need urgently and use frequently. Customers also use software deployment and patching, but only in around 75 per cent of cases. The remaining functionality of endpoint management tools is generally wasted, confusing and delaying the endpoint management process.
As a result of this excessive functionality, the user interface (UI) of traditional tools inherits the complexity too. IT teams find it harder to navigate the UI, which inevitably leads to additional costs for supplier training services or, even worse, administrators giving up and performing tasks the old way.
It’s also quite often the case that traditional endpoint management tools actually require dedicated systems administrators to manage the tools and keep them running, such is their complexity. Without the right people, how do you even know the tool is running efficiently and effectively? It might not be running at all. If your business doesn’t have that person or team, you’ll need to hire.
That’s another unwanted cost and another delay to managing devices – and costs are not just limited to people and training either. Traditional endpoint management tools also generate additional costs for servers, software and maintenance. This is usually a significant upfront cost as well as an ongoing maintenance cost. Some of these tools even require servers at each site within the organisation.
MANAGE YOUR ENVIRONMENT, NOT YOUR MANAGEMENT TOOL
Endpoint management tools should remain simple, focused and flexible. Here’s what businesses should be demanding:
A product which starts with the primary requirement for asset inventory, remote control, software distribution and patching, with additional functionality available instantly
They need a simple UI, but with the flexibility to interrogate the system in more detail if required
Naturally, they need low monthly payments with no long-term contract
Businesses need endpoint management tools which are quick to deploy and provide rapid asset discovery, even for modern environments with BYOD policies, virtual environments and mobile device fleets. This means using endpoint management tools which operate from the internet using agentless technology, and do not require the installation of clients that need constant updates and patches
Finally, endpoint management tools should operate from the cloud, because today’s endpoints are inside and outside the firewall. Cloud endpoint management is also better suited to Managed Service Providers, who frequently support customers outside the firewall.
We’re seeing fewer and fewer businesses sign up for on premise software and an increasing demand for cloud services. Businesses neither want nor need to worry about hardware costs and the recruitment of systems administrators.
In 2014, flexibility and simplicity are the name of the game. Endpoint management providers and tools which can’t demonstrate these core principles are destined for the endpoint scrapheap.