This is James Ruddy's Typepad Profile.
James Ruddy
Litchfield CT
Technologist
Recent Activity
In my previous posts I showed how to deploy, discover physical arrays, create virtual arrays and pools, and create an authentication provider. Next I will show how to deploy and configure object data services. From the web GUI select Settings, then Data Services Nodes. The Data Services Nodes screen pops up.... Continue reading
Posted Jul 11, 2014 at TheRuddyDuck
Having now set up a physical and virtual array (HERE), we next need to create an authentication provider to validate logins by tenants to ViPR services. From the ViPR web GUI click Security, then Authentication Provider. Note that a user has to be set up to provide access to LDAP. See this... Continue reading
Posted Jul 11, 2014 at TheRuddyDuck
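Before entering that LDAP account in ViPR, a quick bind test from any machine with the OpenLDAP client tools confirms the credentials and search base work; the hostname, bind DN, password, base DN, and sAMAccountName filter below are placeholders (the filter assumes an Active Directory-style schema), not values from the post:

    # Verify the ViPR bind user can authenticate and search the directory
    ldapsearch -x -H ldap://ldap.example.com:389 \
        -D "cn=viprbind,cn=Users,dc=example,dc=com" -w 'BindPassword' \
        -b "dc=example,dc=com" "(sAMAccountName=testuser)"

If the bind succeeds and the test user comes back, the same values can go straight into the ViPR Authentication Provider form.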
Having deployed ViPR and discovered physical arrays, the next step is to abstract them into virtual arrays and pools for consumption. Below you'll see examples using our Isilon array. From the left side select Virtual Assets and then select Virtual Array. On the Virtual Array screen click Add, give you... Continue reading
Posted Jul 11, 2014 at TheRuddyDuck
After deploying the ViPR controller the next thing to do is add storage resources. This is done through a discovery. Discover Isilon File: from the dashboard, on the left is a series of buttons. Under the Dashboard button is Physical Assets. Click it and on the pop-out select... Continue reading
Posted Jul 11, 2014 at TheRuddyDuck
With the launch of EMC ViPR 2.0 I figured I would update my install in the lab and publish my notes here. The basics are still the same as 1.0 and 1.1, but the interface is slightly different. This is the first in a series to deploy and configure ViPR... Continue reading
Posted Jul 11, 2014 at TheRuddyDuck
This is the last blog of a series that showed how to deploy, configure, and use the Pivotal Cloud Foundry Runtime environment and developers console: Runtime, Developers Console. One of the cool parts of CF is the ability to install services such as Hadoop, MySQL, and MongoDB, and then deploy applications that attach... Continue reading
Posted Jun 30, 2014 at TheRuddyDuck
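As a rough companion to that post, the same install-a-service-and-attach-an-app flow can be driven from the cf command line; the service name, plan, and app name below are illustrative rather than taken from the post:

    # Create a MongoDB service instance from the marketplace
    cf create-service mongodb default my-mongo
    # Push an application, bind it to the instance, and restage so it picks up the credentials
    cf push my-app
    cf bind-service my-app my-mongo
    cf restage my-app

The developers console shows the resulting services and bindings graphically.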
MongoDB World, Day 1: I had a chance this week to attend the first-ever MongoDB World, held in NYC June 24th and 25th. This blog will bring together my notes from the 2 days spent attending the conference and sessions. To start the day we were greeted by a... Continue reading
Posted Jun 25, 2014 at TheRuddyDuck
In the last couple of blog posts I showed how to set up PivotalCF Operations Manager and Elastic Runtime, and set up the PHD service. After setting up our environment we would then want to log in to the developers console to deploy applications. While setting up the elastic runtime, we created a wildcard... Continue reading
Posted Jun 25, 2014 at TheRuddyDuck
This blog will show you how to deploy the PivotalCF Hadoop service using Operations Manager. Once the service is deployed you can use the developers console to launch on-demand Hadoop clusters using the PivotalCF framework. Log in to the Operations Manager console using a web client. On the left side... Continue reading
Posted Jun 25, 2014 at TheRuddyDuck
In the last 2 blog posts (here, and here) I showed how to deploy and configure Pivotal Cloud Foundry Operations Manager and the Operations Manager Director for VMware vSphere. This blog will show you how to deploy the Pivotal Cloud Foundry elastic runtime environment. Elastic Runtime is the framework that... Continue reading
Posted Jun 12, 2014 at TheRuddyDuck
In my last blog post I showed how to deploy Pivotal Cloud Foundry Operations Manager. Once it’s deployed we have to configure it. Pivotal CF Operations Manager is a web application that you use to deploy and manage a Pivotal CF PaaS. It does its deployments using BOSH. BOSH installs... Continue reading
Posted Jun 12, 2014 at TheRuddyDuck
Cloud Foundry is an open source PaaS project that gives a user the ability to deploy platforms across multiple clouds like OpenStack, VMware, vCHS, and AWS. Pivotal CF is the enterprise version of this, and it has 2 main components to enable PaaS: Pivotal CF Elastic Runtime Service – A... Continue reading
Posted Jun 12, 2014 at TheRuddyDuck
This is a continuing series on how to build a data lake. Welcome to part 7: Part1 Part2 Part3 Part4 Part5 Part6. Over the past couple of weeks I've been blogging on how to create a data lake. These blogs included the architecture and how to install a Data... Continue reading
Posted May 21, 2014 at TheRuddyDuck
OpenStack Cinder and Software-Defined Storage (SDS): So a week after the OpenStack Summit in Atlanta kicked off, I've had some time to digest all I saw and heard. Having had the chance to present with John Griffith, the PTL of Cinder, was an amazing experience. John recently published a blog... Continue reading
Posted May 19, 2014 at TheRuddyDuck
This is a continuing series on how to build a data lake. Welcome to part 6: Part1 Part2 Part3 Part4 Part5. GemFire XD is provided as a Pivotal HD installable component, for use with the Pivotal Command Center CLI installer. The CLI installation process installs multiple instances of GemFire XD. You... Continue reading
Posted May 16, 2014 at TheRuddyDuck
This is a continuing series on how to build a data lake. Welcome to part 5: Part1 Part2 Part3 Part4. In this blog post I’ll show you how to enable Isilon to integrate with Pivotal Hawq. In these previous posts I explained the architecture and install of our data lake. At... Continue reading
Posted May 15, 2014 at TheRuddyDuck
OpenStack Summit Atlanta session: I was very fortunate to have the chance to present with Ken Hui and John Griffith in a session titled Laying Cinder Blocks (Volumes): Use Cases and Reference Architectures this week at the OpenStack Summit. The session was standing room only and it was a great... Continue reading
Posted May 14, 2014 at TheRuddyDuck
This is a continuing series on how to build a data lake. Welcome to part 4: Part1 Part2 Part3. The control center server (PCC) will push all the software and configuration information to our PHD nodes, Hawq master, and Hawq segment servers. Create a temp directory and upload the binaries to... Continue reading
Posted Apr 24, 2014 at TheRuddyDuck
This is a continuing blog series on how to build a data lake: Part1 Part2. In part 2 I showed the architecture we are building for a data lake. In this blog I will begin to show how to deploy and integrate it all together. We’ll start with the base,... Continue reading
Posted Apr 22, 2014 at TheRuddyDuck
ViPR HDFS is a POSIX-like Hadoop compatible file system (HCFS) that enables you to run Hadoop 2.x applications on top of your ViPR storage infrastructure. You can configure your Hadoop distribution to run against the built-in Hadoop file system, against ViPR HDFS, or any combination of HDFS, ViPR HDFS, or... Continue reading
Posted Apr 15, 2014 at TheRuddyDuck
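As a loose sketch of what "running against ViPR HDFS" looks like from a client once the distribution is configured, ordinary hadoop fs commands are simply pointed at a viprfs URI; the bucket, namespace, and site names here are placeholders, and the exact viprfs URI layout depends on your ViPR data services setup:

    # Read and write through the ViPR HDFS client instead of native HDFS
    hadoop fs -ls viprfs://mybucket.mynamespace.mysite/
    hadoop fs -put localfile.txt viprfs://mybucket.mynamespace.mysite/data/
    # The built-in Hadoop file system remains available alongside it
    hadoop fs -ls hdfs://namenode:8020/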
ViPR HDFS is a POSIX-like Hadoop compatible file system (HCFS) that enables you to run Hadoop 2.0 applications on top of your ViPR storage infrastructure. You can configure your Hadoop distribution to run against the built-in Hadoop file system, against ViPR HDFS, or any combination of HDFS, ViPR HDFS, or... Continue reading
Posted Apr 15, 2014 at TheRuddyDuck
ViPR HDFS is a POSIX-like Hadoop compatible file system (HCFS) that enables you to run Hadoop 2.0 applications on top of your ViPR storage infrastructure. You can configure your Hadoop distribution to run against the built-in Hadoop file system, against ViPR HDFS, or any combination of HDFS, ViPR HDFS, or... Continue reading
Posted Apr 15, 2014 at TheRuddyDuck
Today EMC launched the Hadoop Starter Kit (HSK) ViPR edition. These kits are designed to help deploy a Hadoop environment and use EMC ViPR as a Hadoop-compatible file system for HDFS. There are 3 separate guides that each focus on how to deploy ViPR data services, create an... Continue reading
Posted Apr 14, 2014 at TheRuddyDuck
In my previous post I shared the origins of the Data Lake pilot within the EMC Open Innovations Lab. Based on those criteria we decided we needed to build a new analytics environment that would allow for real-time data processing and the ability to compare it to historical data.... Continue reading
Posted Apr 1, 2014 at TheRuddyDuck
So I’ve seen a lot of blogs recently talking about the Data Lake: what it is and what it means. My favorite has been Steve Todd's, which gives a good high-level overview of what a data lake is. In the EMC Open Innovations Lab (OIL) we are constantly working... Continue reading
Posted Mar 27, 2014 at TheRuddyDuck