Many, if not all, of us are required to work remotely. The ability to complete our work while on the go is essential to avoid conceding deals to our competitors. The need for sales and professional services staff to access company data urgently from outside the office is more prevalent than ever, as is the need to ensure that remote access is secure.
The University of Nevada, Reno on August 2 announced the development of a new, user-driven, high-performance computing cluster that will boost research capacity and better support the latest research applications, such as artificial intelligence, machine learning, robotics, computational biology and neurosciences, bioinformatics and big data.
The Biggest Little City just scored a big milestone. The first building in Switch’s Citadel Campus officially opened for business at the Tahoe Reno Industrial Center east of Reno-Sparks earlier this month. Called “Tahoe Reno 1,” the 130-megawatt facility is one of several data center buildings planned for the project.
About a decade ago, a handful of Google’s most talented engineers started building a system that seems to defy logic. Called Spanner, it was the first global database, a way of storing information across millions of machines in dozens of data centers spanning multiple continents, and it now underpins everything from Gmail to AdWords, the company’s primary moneymaker.
The use of broad data sets gives corporations the ability to crunch tremendous amounts of customer data to identify trends and make sound decisions. For a tiny handful of cutting-edge companies, the same disciplined use of data for day-to-day decision-making provides a solid pathway to success. Bristlecone Holdings, a fast-growing financial technology company headquartered in Reno’s Midtown District, demonstrates the wide-ranging importance of using data as a business tool. The company’s very existence depends on data utilization.
I can remember installing Doom on one of my first computers. I recall opening the box and being showered in something to the tune of fifteen 1.44 megabyte floppy disks. Technology has come a long way since then. The 16 gigabyte thumb drive on my key chain, for example, has the same capacity as 11,377 of those floppy disks, which is a nice improvement! With that increased storage capacity comes the potential for huge amounts of data that may be absolutely critical to the function of a business. A network attached storage device at home might hold thousands of songs or priceless pictures of your kids when they were young.
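The floppy-disk comparison above checks out with a quick back-of-the-envelope calculation (assuming binary gigabytes, i.e. 1 GB = 1,024 MB, which is the convention that yields the figure in the text):

```python
# Quick sanity check: how many 1.44 MB floppy disks fit in a 16 GB thumb drive?
# Assumes binary gigabytes (1 GB = 1024 MB) to match the article's figure.
floppy_mb = 1.44            # capacity of one 3.5" floppy disk, in megabytes
thumb_drive_mb = 16 * 1024  # 16 GB expressed in megabytes

floppies = thumb_drive_mb / floppy_mb
print(int(floppies))  # prints 11377
```

With decimal gigabytes (1 GB = 1,000 MB) the answer would instead be about 11,111 disks, so the published number implicitly uses the binary convention.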
For the past two years, I’ve tried to show—in a simple way—how big data is different. I attribute the complexity of big data to two primary reasons. The first is that you need to know 10 to 30 different technologies just to create a big data solution. The second is that distributed systems are just plain hard.