AUTHORED BY DEANA EILAND, VICE PRESIDENT OF DELIVERY @ GUIDEIT
Teams are a fundamental part of our work and personal lives. But creating a team is not the same as creating a team that works, just as joining a team is not the same as performing as a team member. Very simply, teams do not work without teamwork. Active, collaborative teamwork toward a common goal makes all the difference.
How do you build a team that works?
- Be Aware of How You Work – Know your strengths and weaknesses, hold yourself accountable, course-correct and modify your approach if needed to ensure you are leading from a position of strength.
- Get to Know the Rest of the Team – Invest the time to know your team’s individual strengths and weaknesses, how they are wired and what motivates them to excel beyond what is expected.
- Clearly Define Roles & Responsibilities – Each of your team member’s responsibilities should be interconnected and dependent on one another. Unique strengths and differences can convert into a powerful united force when aligned properly.
- Be Proactive with Feedback - Feedback is a two-way street and is key to staying on track and course correcting when needed.
- Acknowledge and Reward – People love recognition and appreciate respect. Take the time to give your team the accolades they have earned and deserve.
- Always Celebrate Success! – In today’s fast-paced world, people often don’t take the time to step back and truly appreciate what it took to cross the finish line. Don’t ignore it. Your team’s accomplishment likely came with some sacrifice by team members. Celebrating their success and the overall impact of the achievement is critical.
"The way a team plays as a whole determines its success. You may have the greatest bunch of individual stars in the world, but if they don't play together, the club won't be worth a dime." – Babe Ruth
AUTHORED BY JOHN LYON, CHIEF OF FINANCE @ GUIDEIT
The reality of organizational life is never black and white. More often than not, accountability is muddled and people are not fully aware of the direct connection between their efforts and results. We tend to keep ourselves from being productive simply by not holding ourselves accountable for our actions. It is of utmost importance to first hold yourself accountable for your own obligations, commitments, and actions before participating in a team environment.
Accountability is about improvement. Improve oneself, and the team will improve in turn. Tom Price nailed it when he said, "One person's embarrassment is another person's accountability." We are all in a leadership role, as all team members are responsible for contributing to the success of the organization. Without accountability, an organization would cease to exist. By not owning up to your responsibilities, you betray not only yourself, but your team as well.
The major leagues would never send a player on the field who has consistently missed mandatory practices, for obvious reasons; such an action would diminish the collective hard work of the other team members, and scores would decline rapidly. The same goes for any type of team. There must be rules and adherence. A pattern toward advancing success. And that pattern begins with the individual.
It is up to me and no one else to make sure I am doing what I know I should be doing. When someone has to hold me accountable, because I failed to do what I should have done, I have a serious conversation with myself. My belief is that no one should have to hold me accountable for my actions, responsibilities and goals. While I appreciate others helping me get better, I am the one that must hold myself to a high standard.
I am convinced if you want to advance your life personally or professionally, you must hold yourself accountable for your actions, responsibilities, and goals. Think about it. Commitment is a choice and a decision that should be made responsibly. Why should it be someone else’s job to make sure you are doing the things that you know you should be doing?
AUTHORED BY MARK JOHNSON, VICE PRESIDENT OF MANAGED SERVICES @ GUIDEIT
You pick up the paper or watch the news and it has become an all too common occurrence. What used to surprise us is now sadly routine – breaches of cyber security. In the early days these breaches were usually just an annoyance – most simply focused on defacing public facing websites. Plug the vulnerability, re-upload your homepage, and you were back in business. Almost seems quaint now doesn’t it? Today the stakes are much higher, both from a commercial standpoint and from an international security standpoint as well.
While much of the preventive focus for cyber-security justifiably falls on IT, the role of each and every user is critical as well. From password hygiene to awareness of social-engineering threats to prudent behavior with attachments and web browsing, many enterprises are only as strong as their weakest user. One of GuideIT’s managed services customers places significant emphasis on the importance of user awareness in their overall cyber-security program, and recently completed a Phishing exercise I thought worth sharing.
To establish a baseline from which to measure the results of an upcoming training program focused on Phishing, every employee was sent an outside email informing them that their email storage quota had been exceeded, and directing them to click an enclosed link to address the issue. The organization’s Information Security policy dictated that they forward suspicious emails to the GuideIT Service Desk, who would either confirm/deny the authenticity of the email, or open a ticket to the customer’s Security Team for review. So how’d they do?
- 90 people clicked the link - they failed the test outright.
- 50 people forwarded the note to the Service Desk AFTER clicking on the link, many asking, “Hey the link didn’t work; how can I get more storage??” They also failed the test.
- 40 people forwarded the email to the Service Desk without clicking the link, and identified the email as a potential Phishing attempt – BRAVO!
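For the arithmetically inclined, the tallies above work out as follows (a trivial sketch; the counts are taken directly from the exercise):

```python
# Tally the phishing-exercise results: any click counts as a failure.
clicked_only = 90           # clicked the link, never reported it
clicked_then_reported = 50  # clicked first, reported afterward
reported_only = 40          # reported without clicking

total = clicked_only + clicked_then_reported + reported_only
failed = clicked_only + clicked_then_reported

print(f"{failed} of {total} failed ({failed / total:.0%})")  # 140 of 180 failed (78%)
```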
Obviously no harm came of this exercise. But had the threat been real, the outcome might have been different. The lesson? First it’s worth emphasizing that this particular customer has an active IT Security Program using both internal dedicated IT resources, and the assistance of an outside Security vendor to audit and support their efforts. Yet the majority of people who received the Phishing attempt “took the bait”. With this particular customer, the next time a user fails a Phishing attempt they will be directed to a mandatory online training module to raise their awareness on the risks of Phishing – a great motivator huh?
The lesson to me is that even with strong internal programs to raise cyber awareness, your work is never done. And if you don’t have programs in place like this customer, give serious thought to how your organization would perform if put under the same microscope.
Stay tuned; this customer plans additional testing over the course of the year to gauge the effectiveness of their training efforts. I’ll be sure to provide an update when they do.
AUTHORED BY DONALD C. GILLETTE, PH.D., DATA CONSULTANT @ GUIDEIT
This week we will explore what is, in my opinion, the best BI product currently on the market: Redshift by Amazon Web Services (AWS).
Amazon Redshift delivers fast query performance by using columnar storage technology to improve I/O efficiency and parallelizing queries across multiple nodes. It uses standard PostgreSQL JDBC and ODBC drivers, allowing you to use a wide range of familiar SQL clients. Data load speed scales linearly with cluster size, with integrations to Amazon S3, Amazon DynamoDB, Amazon Elastic MapReduce, Amazon Kinesis or any SSH-enabled host.
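Because Redshift speaks the PostgreSQL wire protocol, any standard client library can connect to it. Here is a minimal sketch using Python's `psycopg2`; the cluster endpoint, database name, and credentials are placeholders, and the live connection is commented out since it requires an actual cluster:

```python
# Redshift endpoints are ordinary PostgreSQL DSNs (default port 5439).
def redshift_dsn(host, dbname, user, password, port=5439):
    """Build a libpq-style connection string for a Redshift cluster."""
    return f"host={host} port={port} dbname={dbname} user={user} password={password}"

# Placeholder endpoint, not a real cluster.
dsn = redshift_dsn("examplecluster.abc123.us-east-1.redshift.amazonaws.com",
                   "analytics", "admin", "secret")

# import psycopg2                      # pip install psycopg2-binary
# conn = psycopg2.connect(dsn)         # requires a live cluster
# with conn.cursor() as cur:
#     cur.execute("SELECT COUNT(*) FROM sales;")
#     print(cur.fetchone()[0])
```

The same DSN works from any SQL client that supports PostgreSQL drivers, which is exactly the point of Redshift's driver compatibility.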
Redshift’s data warehouse architecture allows the user to automate most of the common administrative tasks associated with provisioning, configuring and monitoring a cloud data warehouse. Backups to Amazon S3 are continuous, incremental and automatic. Restores are fast! You are able to start querying in minutes while your data is spooled down in the background. Enabling disaster recovery across regions takes just a few clicks.
Security is built-in. Redshift enables you to encrypt data at rest and in transit (using hardware-accelerated AES-256 and SSL), isolate your clusters using Amazon VPC, and even manage your keys using hardware security modules (HSMs). All API calls, connection attempts, queries and changes to the cluster are logged and auditable.
Redshift uses a variety of innovations to obtain the highest query performance on datasets ranging in size from a hundred gigabytes to a petabyte or more. It uses columnar storage, data compression, and zone maps to reduce the amount of I/O needed to perform queries. It has a massively parallel processing (MPP) data warehouse architecture. Parallelizing and distributing SQL operations, it takes advantage of all available resources. The underlying hardware is designed for high performance data processing, using local attached storage to maximize throughput between the CPUs and drives, and a 10GigE mesh network to maximize throughput between nodes.
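The zone-map idea mentioned above is simple enough to sketch: store a per-block min/max, and skip any block whose range cannot contain the predicate. The toy Python below illustrates the principle only; it is not Redshift's actual implementation:

```python
# Toy zone map: a per-block (min, max) lets a scan skip blocks entirely.
def build_zone_map(column, block_size):
    blocks = [column[i:i + block_size] for i in range(0, len(column), block_size)]
    return [(min(b), max(b), b) for b in blocks]

def scan_equals(zone_map, target):
    """Return matching values and the number of blocks actually read."""
    blocks_read, hits = 0, []
    for lo, hi, block in zone_map:
        if lo <= target <= hi:          # only touch blocks that could match
            blocks_read += 1
            hits.extend(v for v in block if v == target)
    return hits, blocks_read

# Sorted data gives tight ranges, so most blocks are skipped.
zm = build_zone_map(list(range(1000)), block_size=100)
hits, blocks_read = scan_equals(zm, 457)
print(hits, blocks_read)   # [457] 1
```

One block read out of ten: that reduction in I/O, multiplied across every node in the cluster, is where the query speed comes from.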
With just a few clicks of the AWS Management Console or a simple API call, you can easily change the number or type of nodes in your cloud data warehouse as your performance or capacity needs change. Amazon Redshift enables you to start with as little as a single 160GB DW2 Large node and scale up all the way to a petabyte or more of compressed user data using 16TB DW1 8XLarge nodes.
While resizing, it places your existing cluster into read-only mode, provisions a new cluster of your chosen size, and then copies data from your old cluster to your new one in parallel. You can continue running queries against your old cluster while the new one is being provisioned. Once your data has been copied to your new cluster, Redshift will automatically redirect queries to your new cluster and remove the old cluster.
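With the AWS SDK, that resize is a single API call. A sketch using Python's `boto3` follows; the cluster identifier and node count are hypothetical, and the live call is commented out because it requires AWS credentials:

```python
# boto3's Redshift client exposes resizing via modify_cluster.
def resize_params(cluster_id, node_type, node_count):
    """Assemble the keyword arguments for redshift.modify_cluster()."""
    return {
        "ClusterIdentifier": cluster_id,
        "NodeType": node_type,
        "NumberOfNodes": node_count,
    }

# Hypothetical cluster: grow to four dw2.large nodes.
params = resize_params("examplecluster", "dw2.large", 4)

# import boto3                                    # requires AWS credentials
# boto3.client("redshift").modify_cluster(**params)
```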
Redshift allows you to choose On-Demand pricing with no up-front costs or long-term commitments, paying only for the resources you provision. You can obtain significantly discounted rates with Reserved Instance pricing. With affordable pricing that provides options, you’re able to pick the best scenario to meet your needs.
Stay tuned for part 3 next week. In the meantime, what's your view on Redshift or other tools? Any challenges or projects you want to discuss?
AUTHORED BY DONALD C. GILLETTE, PH.D., DATA CONSULTANT @ GUIDEIT
Let’s take a look this week at the benefits of external hosting of our data warehouse.
With prices so affordable from well-known providers like Amazon (AWS), Microsoft (Azure), and Google, there is no business reason to host a data warehouse internally. All three refer to this process as “Cloud Computing”. No offense to the software and hardware marketing professionals reading this post, but I really think “Cloud Computing” is an overloaded phrase (“overloaded” being an object-oriented programming term for one name carrying many meanings). Not to date myself, but thirty years ago we had a process called “time sharing services”, also available from various vendors, which allowed us to run several types of statistical simulations and business analytics. Cloud Computing is nothing more than what we did thirty years ago, on a much larger scale.
Data in a data warehouse is managed in a columnar format based on some kind of key (unique or non-unique). This enables analysis to be done on de-normalized rows created from a fact table. In the mainframe days this was called an inverted list. Today the cost of doing the exact same thing is orders of magnitude lower.
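A fact table keyed into its dimension tables is what makes those de-normalized analysis rows cheap to produce. A minimal illustration in plain Python, on entirely synthetic data (not any particular product's storage format):

```python
# Tiny star schema: joining the fact table out to its dimensions
# yields the de-normalized rows an analyst actually queries.
customers = {1: "Acme Corp", 2: "Globex"}   # dimension: key -> name
products  = {10: "Widget", 11: "Gadget"}    # dimension: key -> name

facts = [  # fact table rows: (customer_key, product_key, amount)
    (1, 10, 250.0),
    (2, 10, 125.0),
    (1, 11, 300.0),
]

denormalized = [
    {"customer": customers[c], "product": products[p], "amount": amt}
    for c, p, amt in facts
]

print(denormalized[0])  # {'customer': 'Acme Corp', 'product': 'Widget', 'amount': 250.0}
```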
Google, AWS and Azure all offer similar partners providing SaaS in the same variety of business categories; however, each has its area of specialization. All leverage their extensive data networks and processing capacities on a worldwide scale. Years ago I worked at a co-location center in Dallas that was attached to a major telecom provider. I was installing hardware in a rack one Friday and noticed a large cage and racks being installed next to our location. Returning the following Monday, I found roughly 5,000 servers placed in that new rack space, all humming away. It was Google installing a regional center for web-searching capacity. I thought that was an enormous economy of scale in 2006. Imagine what it is in 2015!
Google has the edge in web metrics. Any metric about a web site, its usage, users, business, demographics, or anything else imaginable: Google has it remembered. Not only remembered, but codified and classified as well. The only drawback from my perspective is that their tools don’t seem user friendly. Google also has a unique “what if” analysis for digital marketing which neither of the others seems to address.
AWS has created Redshift. This first-rate product has an excellent architecture built to obtain extremely high query performance on datasets ranging from a few hundred gigabytes to a petabyte or more. It uses columnar storage, data compression, and zone maps to reduce the amount of I/O needed to perform queries. It also uses a massively parallel data warehouse architecture, parallelizing and distributing SQL operations to take advantage of all available resources. Costs are minimal, changing frequently depending upon the competition, but pricing is very affordable. AWS also competes well in the SaaS market, offering options parallel to those provided by the others.
Azure is the newest participant in the threesome of data warehouse providers. Not only are they becoming competitive in the market, they are the logical choice if you are a Microsoft shop. One large transition issue Microsoft data warehouse shops usually face is in changing from SQL 2008 R2 to SQL 2012 or 2014. Another consideration is that Azure pricing and configuration is somewhat confusing and their customer service tools may add to the confusion.
All of these services are more than capable of solving a data-warehousing requirement. It’s just a matter of which one meets the needs of your business. In the following weeks I’ll guide you through implementation of a data warehouse in each of the above vendors and provide specific examples and output.
AUTHORED BY WENDY DURRE, CUSTOMER EXECUTIVE @ GUIDEIT
What do you think when you hear about the “touchy-feely” side of IT? Am I referring to a new, softer keyboard, or something that works completely in emojis? Try again! Believe it or not, TECHNOLOGY impacts our lives not only in a practical way, but in an emotional way.
What I’m saying is **YOU** have an impact on others and the world as an IT professional. If you have a career in IT, whether it be a Service Desk Agent, Project Manager, Developer, Marketer, or Executive Leader, you have experienced the touchy-feely side of IT...and you may not even realize it.
Have you ever thought about how your work impacts others? And how do you feel about your work? According to a recent study, only 39% of employees believe the meaningfulness of their job (its contribution to society as a whole) is important to overall job satisfaction, yet 61% are passionate about their work and 71% say they frequently put all their effort into it. The takeaway here is that employees who find their work meaningful and fulfilling are more likely to be engaged and do their work well.
Here’s an example. Does your work assist in the creation of IT jobs or increase employment opportunities in the IT space? Your impact may look something like this: You hire a candidate. That candidate has a family. That family lives in a home purchased through a realtor who helped them find the best location close to work. That candidate also works with a team within the company. That team services the needs of their customer. That team works on maintaining the new EMR application adopted by a medical practice treating and assessing ER patients. We are definitely beyond keyboard, servers, and code.
By digging deeper and evaluating what our job is, we are able to understand that not only are we maintaining systems, we are impacting lives. Every day as a result of your work, you impact hundreds of people. It may seem like your job is a small part of a big process, but to those on the receiving end of your efforts, it is huge!
I challenge you this week to see the scope of your impact on others through your job. I’d love to hear how it changes the way you see yourself in your company and community. So please leave your comments below and make it a great day!
AUTHORED BY DONALD C. GILLETTE, PH.D., DATA CONSULTANT @ GUIDEIT
Last weekend I read a very interesting book entitled “The Quants: How a New Breed of Math Whizzes Conquered Wall Street and Nearly Destroyed It” by Scott Patterson. I highly recommend it as a must-read for anyone doing Business Intelligence, and especially Data Mining.
So what is Data Mining? Basically it is the practice of examining large databases in order to generate new information. Ok, let’s dig into that to understand some business value.
Let us consider the US Census. By law, it is conducted every ten years, and it produces petabytes (1 petabyte is one quadrillion bytes) of data crammed full of facts that are important to almost anyone doing data mining for almost any consumer-based product or service. Quick sidebar and promo…in part 2 of this micro series, I will share where databases like the census and others can be accessed to help make your data mining exercise valuable.
So suppose the marketing department asked me to help predict how much to spend on a new advertising campaign for a health care product that enhances the existing dental benefits of people already in qualified dental plans. I would have a need for data mining. With these criteria, I would, for example, query the average commute time of people over 16 in the state of Texas: 25 minutes. We would now have a cornerstone insight to work from. This of course narrows the age group to those earning incomes and not on Social Security and Medicare. To validate a possible conclusion, we run a secondary query on additional demographic criteria and learn that the 25-minute commute volume count doesn’t change, yet 35% of the people belong to one particular minority segment.
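Once a census extract is loaded, that kind of query is a straightforward filter-and-aggregate. Here is a sketch with pandas on a tiny synthetic stand-in for the real dataset; the column names and values are illustrative, not actual census fields:

```python
import pandas as pd

# Synthetic stand-in for a census extract; real column names will differ.
df = pd.DataFrame({
    "state":       ["TX", "TX", "TX", "OK"],
    "age":         [34, 15, 52, 40],
    "commute_min": [25, 10, 30, 20],
})

# Average commute time for people over 16 in Texas.
texas_workers = df[(df["state"] == "TX") & (df["age"] > 16)]
print(texas_workers["commute_min"].mean())  # 27.5
```

The same pattern (boolean mask, then aggregate) scales to the real multi-gigabyte extracts; only the loading step changes.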
I pass this information to the Marketing Department and they now have the basis to understand how much they should pay for a statewide marketing campaign to promote their new product, when to run the campaign, and what channels and platforms to use.
DATA MINING, can’t live without it. Next week we’ll cover how and where to mine.
AUTHORED BY RON HILL, VICE PRESIDENT OF SALES @ GUIDEIT
It was a sunny winter day and I had just started as the Client Executive at one of the largest accounts in the company. Little did I know, clouds were about to roll in. The CIO walked into my office and sat down with a big sigh. She communicated that they were ending our agreement and moving to a different service provider. We had 12 months. It required immediate action by our company, implications in the market would ensue, and an environment of uncertainty was born for our team of more than 700 people providing service support.
This was no time to defend or accept defeat. We had to act. Our account leadership team readied the organization for the work ahead and imminent loss. We formally announced the situation to the organization. There were tears and some were even distraught. Our leadership team had not faced this situation before. The next 12 months looked daunting.
Regardless, it was time to lead. We created a “save” strategy and stepped into action beginning with daily team meetings. We invested time prioritizing and sharing action items and implications about information systems, project management, and the business process services. It was our job to operate with excellence, despite the past. It was our job to honorably communicate knowledge to the incoming service provider. One of the outcomes of our work was a weekly email outlining past week accomplishments and expectations for the next week. The email often included a blend of personal stories and team success. We even came up with a catchy brand for the email…Truth of the Matter. It turned out to be a key vehicle that kept our teams bonded and informed. Our leadership team used it as a vehicle to help maintain trust with the team.
During our work, we also began to rebuild trust with the customer as we continued to support them in all phases of their operation. Because of our leadership team’s commitment to service, transparency, and integrity, the delivery team was inspired in achieving many great milestones during those 12 months. We were instrumental in helping our customer achieve multiple business awards including a US News and World Report top ranking. We also found ways to achieve goals that established new trends in their industry. Before we knew it, the year had come and gone and we were still there.
Reflecting back, since that dark day when the CIO informed me that we were done, it was actually the beginning of more than a decade-long relationship. The team had accomplished an improbable feat. In the end, it was the focus of our leadership to come together with a single message and act with transparency…letting their guard down to build an environment of trust with the team and with the customer. This enabled all of us to focus on meeting the goals of the customer, together.
AUTHORED BY DONALD C. GILLETTE, PH.D., DATA CONSULTANT @ GUIDEIT
Last week we declared, “If you don’t embrace the fact that your business’ greatest asset is your data, not what you manufacture, sell or any other revenue-generating exercise, you will not exist in five years. That’s right…five years”.
This week, I’m introducing a perspective on leveraging Big Data to create tangible asset value. In the world of Big Data, structure is undefined and management tools vary greatly across both open source and proprietary offerings, each requiring a set of skills distinct from the world of relational or hierarchical data. To appreciate the sheer mass of the word “big”, we are talking about feeds of 45 terabytes a day from some social media sites. Some of the users of this data have nicknames like “Quants”, and they use tools called Hadoop, MapReduce, GridGain, HPCC and Storm. It’s a crazy scene out there!
Ok, so the world of big data is a crazy scene. How do we dig in and extract value from it? In working with a customer recently, we set an objective to leverage Big Data to help launch a new consumer product. In the old days, we would assemble a survey team, form a focus group and make decisions based on a very small sample of opinions…hoping to launch the product with success. Today we access, analyze, and filter multiple data sources on people, geography, and buying patterns to understand the highest probability store locations for a successful launch. All these data sources exist in various electronic formats today and are available through delivery sources like Amazon Web Services (AWS) and others.
In our case, after processing one petabyte (1000 terabytes) of data we enabled the following business decisions…
- Focused our target launch areas to five zip codes where families have an average age of children from two to four years old with a good saturation of grocery stores and an above average median income
- Initiated a marketing campaign including social media centered on moms, TV media centered on cartoon shows
- Offered product placement incentives for stores focusing on the right shelf placement for moms and children.
While moms are the buyers, children are influencers when in the store. In this case, for this product, lower shelves showed a higher purchasing probability because of visibility for children to make the connection to the advertising and “help” mom make the decision to buy.
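The zip-code screen in the first bullet above reduces to a compound filter over a zip-level rollup. A sketch in pandas over synthetic data; the thresholds and column names are illustrative, not the customer's actual criteria:

```python
import pandas as pd

# Synthetic zip-code rollup; the real version merges many data feeds.
zips = pd.DataFrame({
    "zip":            ["75001", "75002", "75003", "75004"],
    "avg_child_age":  [3.1, 7.5, 2.6, 3.4],
    "grocery_stores": [12, 4, 9, 11],
    "median_income":  [82000, 54000, 91000, 88000],
})

regional_median_income = 65000   # illustrative benchmark

# Children averaging 2-4 years old, good grocery saturation,
# and above-median income.
targets = zips[
    zips["avg_child_age"].between(2, 4)
    & (zips["grocery_stores"] >= 8)
    & (zips["median_income"] > regional_median_income)
]
print(list(targets["zip"]))   # ['75001', '75003', '75004']
```

The analytical work is in sourcing and cleaning the petabyte of input; the final targeting decision compresses into a filter like this.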
Conclusion? The dataset is now archived as a case study and the team is repeating this exercise in other regional geographic areas. Sales can now be compared between areas enabling more prudent and valuable business decisions. Leveraging Big Data delivered asset value by increasing profitability, not based on the product but rather on the use of data about the product. What stories can you share about leveraging Big Data? Post them or ask questions in the comments section.
AUTHORED BY DONALD C. GILLETTE, PH.D., DATA CONSULTANT @ GUIDEIT
If you don’t embrace the fact that your business’ greatest asset is your data, not what you manufacture, sell or any other revenue-generating exercise, you will not exist in five years. That’s right…five years.
Not so sure that’s true? Ask entertainment giant Caesars Entertainment Corp. for their perspective. They recently filed Chapter 11 and have learned that their data is what creditors value (Wall Street Journal, March 19, 2015, “Prize in Caesars Fight: Data on Players”; the customer loyalty program is valued at $1 billion by creditors). The data intelligence on their customers is worth more than any of their other assets, including real estate.
Before working to prove this seemingly bold statement, let’s take a look back to capture some much needed perspective about data.
In the mainframe era, space and resources were expensive, and systems were designed and implemented by professionals who had a good knowledge of the enterprise and its needs. Additionally, very structured processes existed to develop systems and information. All this investment and structure was often considered a bottleneck and an impediment to progress. Critical information, such as a customer file or purchasing history, was stored in a single, protected location. Mainframe Business Intelligence offerings were report-writing tools like Mark IV. Programmers and some business users were able to pull basic reports. However, very little data delivered intelligence like customer buying habits.
Enter the Spreadsheet
With the introduction of the PC, Lotus 1-2-3 soon arrived in the market. We finally had a tool that could represent data in a two-dimensional (2D) format, enabling the connection of valuable data to static business information. Some actionable information was developed, resulting in better business decisions. This opened up a whole new world to what we now call business intelligence. Yet connecting the right data points was a cumbersome, manual process. Windows entered the scene and with it, the market shifted from Lotus to Excel, carrying over similar functionality and challenges.
Client Server World Emerges
As client/server architectures emerged in the marketplace, data became much more accessible. It was also easier to connect, relative to the past, providing stakeholders real business intelligence and its value to the enterprise. With tools like Cognos, Teradata, and Netezza in play, data moved from 2D to 3D presentation. Microsoft also entered the marketplace with SQL Server. All this change actually flipped the challenges of the mainframe era. Instead of bottlenecked data that’s hard to retrieve, version creep had entered the fold…multiple versions of similar information in multiple locations. What’s the source of truth?
Tune in next week as we provide support for data being your most valuable asset with a perspective and case study analysis of a Business Intelligence model that uses all technology platforms and delivers the results to your smartphone.
AUTHORED BY MARK JOHNSON, VP MANAGED SERVICES @ GUIDEIT
For anyone who has spent the bulk of their career in healthcare IT, a venture into an in/out-patient setting for one’s own health is always an interesting experience. Throughout the process you can’t help but say – “it’s 2015 and we’re still doing this?” For me it was in preparation for that first (dreaded) “over 50 procedure”. It started with far too much paperwork, some of it redundant, and some of it collecting information I had already provided in their portal (sadly with no linkage to my HealthVault account). Then I arrived in the clinic and was not only faced with more paperwork, but music that was playing way too loud on a morning that I was already grumpy from not being able to eat the day prior.
But then, everything changed. Once I left the waiting room, every clinician I interacted with was simply outstanding. From the prep nurse, to the anesthesiologist, to the doctor himself. They actually seemed to really and truly enjoy their work! And their positive approach to delivery of care translated directly to an extremely positive patient-clinician interaction.
So while there’s plenty of time to talk about how to better leverage IT in the delivery of care, for me today this is simply a “hats off and well done” to the people who really make such a tremendous difference in our lives – clinicians and their staff.
Oh, and if you’re wondering – it turns out it was a very good thing I had this taken care of. So listen to your physician.
AUTHORED BY JEFF SMITH, VP BUSINESS DEVELOPMENT @ GUIDEIT
A national healthcare provider was ready to move from multiple PBX systems to a VOIP-centric model for their communications…the transition, one piece of a broader multi-source IT strategy. Simple enough, right? Not exactly. This transition was a monster…500 locations and more than 1100 buildings. Additionally, the provider cares for patients, the majority of whom are in some form of acute need. Sure, any business requires clean execution in a project of this magnitude. But few businesses have the sole mission of caring for the acute health needs of their customers like healthcare providers do for their patients.
Truly lots of moving parts in this story…a story representing one part of the bigger picture. A critical attribute of this provider’s success was ensuring the right IT Governance function encompassing their multi-source strategy.
So what is the right governance? According to Gartner, governance is the decision framework and process by which enterprises make investment decisions and drive business value. Take that one step further applying IT and the definition is, “IT Governance (ITG) is the processes that ensure the effective and efficient use of IT in enabling an organization to achieve its goals. IT demand governance (ITDG—what IT should work on) is the process by which organizations ensure the effective evaluation, selection, prioritization, and funding of competing IT investments; oversee their implementation; and extract business benefits.”
Now consider “why” the right IT Governance is critical in a multi-sourcing environment. When multiple vendor partners serve in support of the broader business mission, the opportunity to optimize outcomes for the business is huge. And so is the risk. The opportunity is there because the organization can leverage the specialization of subject matter experts necessary in a highly complex IT environment driven by growing business demands. One partner specializes in apps, another in cloud infrastructure, another in mobility, and so on. They all bring optimal value in areas critical to support the business…thus the core value of multi-sourcing.
Therein lies the risk too. Without the right governance model, no clear accountability exists to ensure open collaboration and visibility across specialists. Specialists will act in silos. And we all know how silos hurt business. Simply put, the “why” for the right governance is to optimize outcomes through maximizing specialization while minimizing the risk of “silo-creep”. The right governance closes the gap between what IT departments think the business requires and what the business thinks the IT department is able to deliver. Organizations need to have a better understanding of the value delivered by IT and the multiple vendor partners leveraged…some of whom are ushered in through business stakeholders.
Because organizations are relying more and more on new technology, executive leadership must be more aware of critical IT risks and how they are being managed. Take for example our communications transition story from earlier…if there is a lack of clarity and transparency when making such a significant IT decision, the transition project may stall or fail, thereby putting the business at risk and, in this case, patients’ lives at risk. That has a crippling impact on the broader business and on future considerations for the right new technologies to be leveraged.
Conclusion: the right IT Governance is critical to optimizing business outcomes.