Unique architecture for each business-critical application or standardize for efficiency?

I spend a lot of time talking with IT leaders around the world, so I get a first-hand account of the struggles, tradeoffs, and effective approaches of their IT transformation and journey to cloud, especially in dealing with business-critical databases and applications like Oracle, SQL Server, Exchange, SharePoint and SAP. Frankly, it’s the part of my job that I value the most, and there’s always a concentration of these discussions at EMC World. I consistently hear stories of vast landscapes of business-critical production, development and test environments that consume nearly two-thirds of the total IT budget. Why? Each business-critical environment seems big enough for its own unique infrastructure; they’re too important to clump all together in one big homogenized infrastructure; and, well, it’s always been done this way.

But the results have been less than spectacular.

Sure, there’s been progress across the industry. “Standardization” seems to be everyone’s theme. Oracle, Microsoft and SAP have all been offering strategies to standardize within their own field of vision. Take Oracle’s “engineered systems,” for example. It applies standardization vertically and promises optimized levels of IT performance and IT productivity – at least for a part of IT.
That doesn’t mean that standardization is a compromise — it just means that standardization needs to be broadly applied to be effective for all business-critical applications. A vertical approach for a specific database, custom-written application, or one vendor’s stack is not easily leveraged across a virtual cloud computing environment. We have to guard against the illusion of greater IT efficiency for one database or application that may actually create barriers to efficiency for IT as a whole and increase TCO.

EMC faced this challenge as we standardized our own business-critical applications infrastructure, which included Oracle, SAP and Exchange. Like many of our customers, our experience proved that a virtual infrastructure can achieve standardization gains across IT while still optimizing for specific application and database requirements.

I provided specific examples of this in my keynote address at EMC World. Just a small amount of flash in virtualized Oracle databases delivers extreme levels of performance in online transaction processing (OLTP) and data warehouse workloads. How much? Nearly 3 million random I/Os per second for Oracle OLTP environments, and over 28 GB/second read scan performance with over 20 TB/hour data loads for Oracle data warehouses. More importantly, this standardized approach of flash and virtualization makes a measurable impact on long-term TCO. A recent post by Wikibon.org highlights how a small amount of flash, combined with virtualization of Oracle production database servers, can dramatically impact hardware, software and maintenance TCO over a 3-year period.

The best news is that this is a repeatable practice. To help our customers optimize their unique Oracle environments, EMC has created an open, online community for Oracle customers to engage directly with EMC’s global solutions experts.
Here, Oracle DBAs and IT infrastructure teams can access dozens of tested and proven solutions, training materials and case studies from EMC/Oracle customers who have achieved impressive performance and efficiency results virtualizing their Oracle database environments.

When it comes to performance and efficiency for business-critical apps, whether it be Oracle databases, SQL Server, Exchange, SharePoint or SAP, you don’t have to settle for a compromise.
Servers are the heart of the data center, and as hardware continues to evolve, more advanced software is required to fully harness the technological advances in server design. Performance and scalability are obvious enhancements, but one of the most important aspects is the human interface into your servers and overall infrastructure.

Considering the server lifecycle, it’s important to think of the entire cycle, from deployment to management to even retirement of servers. At the core of every Dell PowerEdge server is iDRAC with Lifecycle Controller. iDRAC ensures a consistent, unified interface across the product line, eliminating learning curves through intuitive interfaces. iDRAC is agent-free, so you don’t need an OS or hypervisor to start deploying your servers. To further enhance and simplify management, Dell is the only provider using HTML5 in its systems management.

Beyond more intuitive, simplified systems management and scalable, higher-performing hardware, customers are also looking for integrated, proven solutions that can be delivered either ready to deploy or as certified building blocks on which to build a foundation.

One of the highlights of this evolution is the Dell PowerEdge FX2 platform combined with VMware VSAN. On the surface alone, the platform proves to be the most powerful 2U VSAN cluster in the world, and with the versatility of VSAN licensing you can easily add an all-flash storage array to further scale performance and increase resource utilization.

The configuration in question started with three FC430 nodes and three 8-drive all-flash LUNs. This led to a 10X increase in performance versus the legacy solution. Adding an additional node and LUN increased performance to 16X that of the legacy solution.
At this point, CPU utilization was around 50%. To further scale the solution and demonstrate the versatility of VSAN, an SC4020 all-flash array was added, pushing performance to 32X that of the legacy solution.

Dell’s FX2 and VMware’s VSAN form a powerful combination. By using the Dell Performance Analysis Collection Kit (DPACK), a free IT infrastructure planning and collaboration tool, you can measure the performance of your own environment and compare it to the reference architecture. This will allow you to see how an FX2 and VSAN combination would support your existing IT infrastructure.

For the first time, reference architectures are personalized, providing a direct comparison between your own environment and a reference architecture. With all the architecture choices out there, I encourage you to leverage DPACK and get clarity on how a new architecture could support your next project.
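As a side note on the agent-free iDRAC management described earlier: iDRAC implements the DMTF Redfish REST API, so basic server health can be read over HTTPS without any in-OS agent. Here is a minimal Python sketch, not an official Dell tool; the hostname is a placeholder, authentication is omitted, and `System.Embedded.1` is iDRAC’s conventional system resource ID:

```python
import json
import urllib.request

# Placeholder address for illustration only -- substitute your iDRAC's hostname.
IDRAC_HOST = "https://idrac.example.com"

def system_health(redfish_system: dict) -> str:
    """Extract the rolled-up health from a Redfish ComputerSystem payload."""
    return redfish_system.get("Status", {}).get("HealthRollup", "Unknown")

def fetch_system(host: str = IDRAC_HOST) -> dict:
    # GET the standard Redfish ComputerSystem resource (credentials omitted;
    # a real call would add HTTP Basic auth or a Redfish session token).
    url = host + "/redfish/v1/Systems/System.Embedded.1"
    with urllib.request.urlopen(url) as resp:  # requires network access
        return json.load(resp)

# Offline illustration with a sample payload shaped like a Redfish response:
sample = {"PowerState": "On", "Status": {"Health": "OK", "HealthRollup": "OK"}}
print(system_health(sample))  # prints "OK"
```

Because Redfish is a standard, the same `Status`/`HealthRollup` pattern applies across vendors, which is part of what makes agent-free management attractive at scale.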
Dell Financial Services
- How Dell is helping customers get the best technology solutions through leasing and other payment models that best match their budget
- Overview of the principal job categories within DFS: Legal, Credit, Pricing, Compliance, Risk, etc.
- Security and fraud in a banking environment

HR
- Step-by-step process on how to write a resume and cover letter
- Interview preparation
- Speed interview exercise
- Personal branding: practice session

Are you fully equipped with the resources to best support your children in their academic career choices? Do you understand what new jobs will evolve? What training will be essential for them? We can help you!

Transition Year (TY) school students needing an understanding of their career options, plans for the future and fields of study for their Leaving Certificate face many doubts. They all have their own questions, requiring personalized responses catered to their personal situations. These are the questions that Dell wants to address by helping TY students reinforce their choice of orientation and find answers.

We recently held our second official work experience programme for Transition Year students and had the pleasure of meeting a total of 36 students with very diverse profiles and interests. The feedback was phenomenal!

‘‘It was, in my opinion, a very instructive week; it gave me a better understanding of my skills and how they relate to my career possibilities,” said Jack, a recent student of the programme.

What is the purpose of the Transition Year (TY) programme?

It’s a win-win situation. Among our goals for 2020 is to connect the youth of today with a more promising tomorrow through the power of technology. As we pursue our mission to enable people everywhere to grow and thrive, Dell is committed to helping students make more informed choices about their future.
And because our goal is to recruit talent from around the world, we want to be as close as possible to our potential candidates.

On the other side, the week-long structured TY programme allows students to establish their first contacts with the business world. It is a way for them to learn more about themselves and develop life skills that will prove important for years to come.

Dell is a truly diverse organization. Our team members come from many different backgrounds, and we work with customers globally in a multitude of sectors and business areas. For the students, that means there is a world of opportunity to create a diverse and challenging career.

‘’This initial contact with the business world is very important. Many teenagers find themselves intimidated or out of their comfort zone in front of professionals. I came here to confront this new world, and it’s a real opportunity for my development,’’ said Ellen, a recent student. ‘’By Friday, I no longer felt like an intruder. Everyone was so friendly and more than willing to help out.’’

So how does it work?

It is a practical discovery! It starts with presentations of the company and its wide range of functions, all aimed at helping our students learn more about our culture and career offerings. Various workshops, presentations and fun team-building activities are set up over the following days, allowing students to spend time with professionals in action and in their work environment.

Examples of topics covered:

IT
- Digital transformation and discovering what’s next
- Overview of the principal duties and responsibilities of a Technical Support Engineer, and the key skills required
- Job shadowing: listening in on calls

At the end of the week, the students had gained valuable insight into our company in general and had become familiar with our IT environment and the work we do in different roles.
In fact, 83% of our students said that the work experience had made them even more determined to pursue a career at Dell.

What we heard from our students:
- ‘Planning for the future is very important even if those plans may change. Be ready to bounce back.’
- ‘At Dell, it’s never boring, always challenging.’
- ‘With passion is how people choose a career.’

Take advantage of our TY Programme! Walk up to us, introduce yourself, and join us for a week of learning, networking and connecting with the future.
Another opportunity to hear from Michael Dell is coming at the SXSW Conference in March. Dell will be joined by Clay Johnston, the inaugural dean of the Dell Medical School, to discuss “When Health Care Goes High-Tech.” Conference attendees can also see innovation in tech and meet other disruptive leaders making transformation real at THE EXPERIENCE, coming from Dell Technologies at SXSW.

“Here I am, supposed to be going to college and I’ve got this thriving business in my dorm room,” Michael Dell recently told Guy Raz when being interviewed for his “How I Built This” podcast.

It’s the story most people are familiar with when they think of Dell. And while those dorm room computer sales may have grown into today’s Dell Technologies, it’s not where the story really begins.

No, before he was buying computers, “souping them up” with more capability and reselling them from the campus of The University of Texas at Austin, Dell had a fascination with how things worked and an innate acumen for business.

Dell told Raz he had a wide variety of businesses as a kid – from selling baseball cards to a stamp auction, to working in a gold coin and jewelry store buying items for resale. But to me, it’s his story of selling newspaper subscriptions that really gives insight into his ability to understand customers.

He said he observed three things that helped him formulate a plan that, at just 17 years old, earned him an income equal to my first job out of college:
- If you sounded like the people you were talking to, they were much more likely to buy the newspaper from you.
- People who were getting married were much more likely to buy the newspaper.
- People who were moving into a new house or residence were also far more likely to buy the newspaper.

So, he lined up some high school buddies to go to local county courthouses and bring back public information on who had applied for marriage licenses, then sent those people letters with newspaper subscription offers.
And he went to local condominium and apartment complexes that were under construction and pitched them on trial subscription offers for their new residents.

“I did plenty of things that didn’t work, but that worked, so I kept doing it,” he told Raz.

That willingness to try many things, and the tenacity to keep at it when they didn’t always work, probably helped when it came time to try to reassemble some of the things he took apart.

You see, a fascination with his father’s adding machine led to the purchase of his first electronic calculator at age seven or eight. And the proximity of a Radio Shack store between home and school meant much time spent hanging out there checking out new technologies. But just looking at them wasn’t enough.

“What else would you do?” Dell replied when Raz was amazed to hear that he’d taken apart an early IBM PC he bought, to determine that the $3,000 system was actually made from about $600 worth of parts. (Now you really see the beginnings of that dorm room business.)

“I wanted to understand it,” Dell explained. “And to understand it, you had to take it apart.”

If you want to understand the vision and leadership that drives our company, then I encourage you to take time to listen to the full interview:
We live in a world of seemingly endless choices when it comes to which brand of t-shirt to buy, what to eat for dinner, or which route to take as you commute to work. According to psychologists, adults make an average of 70 conscious decisions each day, with unconscious decisions numbering in the thousands. It can quickly become overwhelming. And those everyday decisions are commonplace, even mundane! For IT decision-makers tasked with keeping the modern data center operational and secure, what may at first seem like a simple decision quickly takes on monumental significance.

Consider the decision of which hardware vendor to buy from when implementing a server refresh or adding server capacity to the data center. Business leaders push for increasing service levels from IT, but often without a proportional increase in resources. The contradiction leads to pressure on IT decision-makers, forcing them to make tough purchasing choices. The decision to choose a hardware provider versus a hardware partner has vast implications when it comes to building a secure data center. It cannot be taken lightly.

Taking a cheap approach to hardware may significantly increase the total cost of ownership. Cheap hardware often requires earlier replacement and lacks scalability. Most importantly, white box hardware providers don’t take responsibility for firmware and hardware security on the server, leaving the business more vulnerable to malicious attacks. Dell EMC and Enterprise Management Associates (EMA) provide guidance for discerning between a hardware partner (i.e., security leader) and a hardware provider (i.e., security laggard) in two recent white papers on hardware/firmware security. Here’s your quick guide – via infographic – on how to tell the difference.

Dell EMC is a leader when it comes to hardware and firmware security.
PowerEdge servers are embedded with integrated firmware and hardware security features like the dual silicon root of trust, BIOS protection and recovery, and hardware intrusion detection. If you go with a server provider who doesn’t offer hardware and firmware security, you may be left incurring unforeseen costs to integrate those protections after the fact. According to EMA, “It is much more difficult to address server security after deployment and implementation. Server security should be carefully considered from the initial planning phase.”

If you’re unsure how to figure out which server vendors are leading when it comes to security, Dell EMC’s white paper “End-to-end Server Security: The IT Leader’s Guide” is an excellent resource. The paper provides a short list of four questions you can ask each server vendor when making the crucial decision of whom to buy from. EMA also provides perspective in their white paper, going as far as listing examples of companies they consider “hardware providers.”

The server purchase decision is business-critical, but it doesn’t have to be overwhelming. Using hardware and firmware security as a driving factor can make your decision simpler and save money and hassle over the long term. Guidance from trusted industry leaders should inform your decision. Even if you don’t choose PowerEdge servers, you can choose to be an informed consumer. The white papers linked below are an excellent starting point.

Server Security Resources:
- End-to-end Server Security: The IT Leader’s Guide, Dell EMC Business White Paper
- Cyber Resilient Security in 14th Generation Dell EMC PowerEdge Servers, Dell EMC Technical White Paper
- Dell EMC PowerEdge Servers: Investing in a Cyber-Resilient Architecture, Enterprise Management Associates White Paper
In today’s age of digital disruption, one of the greatest challenges companies face is the need to keep up with evolving technology. Speed and agility are key to a successful IT transformation, and organizations that can handle transformational workloads, such as artificial intelligence (AI) and cloud-native applications, have a significant advantage over those that can’t.

On one end of the spectrum are the innovative companies, with modern data center infrastructure and IT automation in place. On the other end are older businesses with slow, outdated processes. A recent ESG study focused on the differences between them describes these two stages as “modernized” and “aging.”

One key difference between the two stages is that aging companies typically prioritize predictability and reliability, while modernized companies prioritize speed and agility. In the past, IT departments were focused solely on traditional workloads like web, email, file, and print. They had to keep the basics up and running for the business to function. But the industry is shifting, and companies’ needs change as they move through their IT transformation. While reliability will always be important, the status quo is no longer enough. Today’s modernized companies must focus on speed and agility so they can quickly process the enormous amounts of data that transformational workloads require.

The same ESG study identified another key difference between modernized and aging companies: the use of modular servers in their infrastructure.
ESG found that modular servers make up an average of 20% of a modernized company’s total server infrastructure, compared to only 5% in aging companies.1 That’s a significant difference, and it plays a huge role in setting modernized companies apart from their aging counterparts.

How Modular Helps with Transformational Workloads

Because modernized companies favor speed and agility over predictability and reliability, they need to make sure they have modern data center infrastructure in place. New, data-intense workloads such as AI and ML have different hardware requirements. Modular servers can play a critical role here, because they are flexible, agile and easy to manage. Modernized companies understand this need, which is why so many already have a modular compute strategy in place.

Modular infrastructure combines server, storage and networking – along with unified management software – so that users can easily tailor workloads and expand over time. It can meet the needs of both traditional and transformational workloads by providing the following benefits:

- Increased Scalability: Modular servers give you the flexibility to adjust resources to deliver the compute, storage, and network performance needed to accelerate both traditional and transformational workloads. In fact, the ESG study found that 57% of modular server users reported increased scalability benefits to the organization.1
- Easier Management: Users can automate the management of compute, storage and networking resources with integrated, easy-to-use tools and spend less time on routine maintenance. Modular servers improved manageability for 50% of surveyed IT organizations.1
- Faster Deployment: Modular infrastructure helps accelerate your time-to-value by quickly deploying traditional and transformational workloads. ESG found the average benefit was a 35% reduction in deployment time among modernized organizations using modular.1
- Improved Reliability: Users can adapt and respond with non-disruptive upgrades and minimal downtime. Modernized IT organizations are twice as likely as aging orgs to experience higher reliability with modular compute.1
- Decreased OPEX: Modular is the original “pay as you grow” model, because it allows you to purchase only what you will use now, then add to it as your needs change. The average reduction in procurement costs from purchasing modular servers (compared to alternatives) was 32% among modernized organizations using modular.1

Once an organization has the right infrastructure in place, it can more easily adopt transformational workloads. These innovative technologies help companies save time, increase productivity, decrease operating costs, and increase revenue. Meanwhile, their competitors will be left further and further behind. Aging companies simply can’t offer the same services or customer experiences and ultimately run much less efficiently.

No matter what stage of IT transformation your company is in, it’s worth considering whether modular servers can take your business to the next level. To learn more about modular infrastructure, read ESG’s full white paper, Insights from Modernized IT: Modular Compute Can Have a Big Impact.

Source: ESG White Paper, Insights from Modernized IT: Modular Compute Can Have a Big Impact, commissioned by Dell EMC, August 2018
As we come up on the end of the second decade of the new millennium and I begin my third year as the product manager for Dell’s Wyse ThinOS thin client firmware solution, I wanted to take some time to reflect on the history of this industry game changer. I will also take a small peek forward as we prepare the ThinOS platform for relevance in the emerging world of “cloud first” application architectures that are quickly taking their place alongside the traditional VDI environments popular today.

When I entered the computer industry, IBM mainframes ruled the world, with challengers such as Digital Equipment Corporation, Hewlett Packard, Sun Microsystems and any number of Un*x-based upstarts all offering their vision of IT to the enterprise. The only thing most of these systems had in common was the ubiquitous terminal, with a CRT and keyboard that allowed users to access the central system interactively – no punch cards or IT staff support needed!

But the 1990s changed everything. Business users, frustrated with the slow pace of application deployments and a seemingly endless backlog of requests, started wresting control from their central MIS departments and began deploying low-cost, yet very powerful, PCs running Microsoft operating systems. They were buying off-the-shelf software or even hiring their own programmers to satisfy the insatiable desire for new applications. And what wonderful applications! MS-DOS quickly gave way to MS Windows and opened the use of complex graphics to simplify user interfaces. Add a mouse as an input device, and the expectations for computer application design changed forever.

But, as with many new capabilities, there was a dark side. While users rejoiced in their powerful new applications and their easier and more intuitive interfaces, the management of it all became orders of magnitude more complex.
What used to be a small set of very powerful and well-managed computer systems exploded into hundreds and even thousands of small machines scattered all over organizations. Users quickly demanded that their IT support staff administer and operate these systems, thus creating an even messier operational environment than ever before. By the mid-’90s the industry was ripe for yet another shift, as the market searched for a way to bring control back to the enterprise. With the introduction of the WinFrame multi-user operating system solution from industry pioneer Citrix Systems, the march to recentralizing the client landscape began anew. Microsoft then absorbed these technologies into their Windows NT family and brought centralized MS Windows-based computing into the mainstream for enterprise IT.

It was into this environment that the Wyse Winterm devices were born: a solution providing graphical terminal support for the Citrix WinFrame and Microsoft Windows NT “windows mainframe” systems that began to take hold in the market. These Winterm devices aimed to deliver connectivity to the new classes of applications running under the Windows environment while providing IT end-user admins a cost-effective and easily managed endpoint. These initial devices were burdened by complex operating systems of their own, with options such as Windows CE or even Linux being used to power the new clients.

It was in this environment that Mike Liang of Wyse Technology locked himself in a lab, known as Area 51, to build out a completely new operating system dedicated to thin client devices. It was released as Wyse Blazer, or ThinOS as we know it today.

Launched to the market in early 2000, it changed the game by essentially offering a terminal-style device with the ability to access modern Windows applications using graphical displays, keyboard and mouse.
This approach was unique in that Wyse developed a platform designed from the ground up to power terminal devices, versus taking a full-function operating system and attempting to restrict it.

Notable ThinOS-based devices

By powering these devices with what is essentially firmware, Wyse was able to bring the security and manageability of terminals to the IT end-user admin, while allowing them to offer their users full access to the modern applications they demanded. A new era was born, and it powered the Wyse brand to a leadership position in the enterprise client market. Wyse came to dominate this segment by establishing close partnerships with Citrix Systems as they continued to pioneer the remote application access solution space, and then by growing along with leaders like Microsoft and VMware as they evolved their own solutions. Over time, this basic solution expanded into ever more complexity with VDI, server-side GPU acceleration and other technologies, which now enable almost any user need to be satisfied from public or private cloud environments. Wyse’s approach of creating what is essentially an end-user access appliance with rock-solid security and a focus on TCO powered tremendous growth, leading to the acquisition of Wyse by Dell in 2012.

It’s these same design tenets that continue to drive the evolution of ThinOS to this very day, but as is always the case in IT, nothing stays put for long.
In the last several years we have seen explosive growth in cloud-based computing, with non-Windows applications being developed on web-first technologies built around HTML5 browsers and internet APIs that will essentially render the notion of a Windows desktop as outmoded as we now consider the old VT240-style MS-DOS displays.

As the current product manager of the ThinOS product line, I am mindful of the decades of history, the talented engineers and technologists who created the operating system, and most importantly the tens of thousands of fanatically loyal customers with millions of end users who depend on this solution each and every day. The application space, as well as the set of solutions being delivered by our virtualization partners, is undergoing rapid evolution, and it is imperative for our firmware to adapt, but not at the expense of the core attributes that make ThinOS the premier thin client platform in the industry to this day.

ThinOS remains the crown jewel of Dell’s thin client offerings. We are looking ahead now, with the support of our current partners at Citrix, VMware, and Microsoft, as well as cloud providers such as Amazon, Google and Microsoft Azure. Our core values of security, manageability and excellent user experiences will remain our touchstones as we move into the world of cloud-first applications.

With everything going on in this space, one thing that will not change is the service, dependability and reliability that have come to define ThinOS.

Note: At DEC we were taught never to fully spell out a dirty word.
NEW YORK (AP) — The moped-sharing company Revel says it is building a charging hub for electric vehicles in Brooklyn this spring. Revel officials say the charging facility will be the first in a network of car-charging hubs planned for New York City. The initial charging hub will be located at the site of the former Pfizer building in the Williamsburg section of Brooklyn and will have 30 stations capable of delivering 100 miles of charge to vehicles in about 20 minutes. The charging stations will be available 24 hours a day to drivers of any type of electric vehicle.
SALT LAKE CITY (AP) — A theme park in Utah has filed a lawsuit against Taylor Swift that accuses her of trademark infringement. Evermore Park said in its suit filed Tuesday that the title of Swift’s 2020 album “Evermore” violates the park’s trademark rights. Swift’s lawyers say the allegations are baseless, and they refuse to comply with a cease-and-desist letter the park sent to Swift on Dec. 18. They added that the singer-songwriter styled her new album in a way that is entirely distinct from the park’s aesthetic. Evermore Park was created in 2018 and features costumed actors and performers.