Tuesday, September 30, 2014

Happy Birthday 2014 Analysis

Today (9/30) is my birthday. I thought I would break down the various birthday wishes and make some graphs, cause I love me some graphs.

Some numbers

  • Total of 41 Happy Birthdays (HBDs) received
  • Earliest bday wish received 9/29/2014 @ 8:12AM from my friend Wendy. Classic Wendy, wishing me a happy bday on the wrong day. Better way early than never.
  • HBDs started at 5:30AM on 9/30. First HBD received on Facebook.
  • Royals playing first Postseason game in 29 years tonight. What a game...just had to mention that.

Hour of Day Breakdown

  • Huge spike in HBDs during the 11:00AM hour.
  • People must have been eating dinner around 5:00PM this evening; no HBDs received that hour.

Who Wished

Most HBDs were received from Facebook Friends, denoted FFriends. Facebook Friends are those folks you only interact with via Facebook. These could be friends from days past, acquaintances, or just straight up Facebook Friends.

Friends here are people I consider closer than FFriends. These are friends that I maintain close contact with, the type of people that would help you hide the body if needed. Of course there is only one wife in this category.

  • Most common family member wishing HBDs was cousin, with 3 HBDs wished.
  • Most popular name: Matt, with 3.
  • Names beginning with M come in first with 6 HBDs.

Method Received

  • 72% of all HBDs were received via Facebook. Friends and FFriends alike use the common social media platform.
  • My parents used the good ol' fashioned phone method. It was good to talk to them; they are out of town traveling around.
  • Coop and Val wished me a happy birthday with their voices.
  • I received 5 cards in the mail.

Thursday, September 25, 2014

Ola Hallengren Command Log History Scripts

If you are like me and use the Ola Hallengren set of free scripts to help manage backup, integrity check (CHECKDB), and index maintenance operations, then you may find yourself needing to dive into the CommandLog history to calculate the duration of some of your common operations. See below for some scripts I wrote to query the data in the CommandLog.

Command Type Duration by Days of the week

I was trying to track down which indexes were being rebuilt on what days. The index job was taking longer on some days than others, and I was looking for a trend. This stored procedure will show you all the ALTER_INDEX commands (or whatever other commands you want to see) by day of week. This will help you get an idea of which indexes are reorganized or rebuilt on which days.

This will show you the duration for each day of the week. If you pass the optional DaysBack parameter as 7 it will show you the history for the past 7 days broken down by day of week.
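The stored procedure itself isn't shown in this excerpt, but the grouping logic can be sketched in Python. The rows below are hypothetical stand-ins for the CommandType, StartTime, and EndTime columns in CommandLog:

```python
from datetime import datetime

# Hypothetical rows mirroring the CommandLog columns the post describes:
# (CommandType, StartTime, EndTime)
rows = [
    ("ALTER_INDEX", datetime(2014, 9, 22, 1, 0), datetime(2014, 9, 22, 1, 45)),
    ("ALTER_INDEX", datetime(2014, 9, 23, 1, 0), datetime(2014, 9, 23, 1, 10)),
    ("ALTER_INDEX", datetime(2014, 9, 29, 1, 0), datetime(2014, 9, 29, 1, 50)),
]

def duration_by_weekday(rows, command_type="ALTER_INDEX"):
    """Total duration in minutes per day of week for one command type."""
    totals = {}
    for cmd, start, end in rows:
        if cmd != command_type:
            continue
        day = start.strftime("%A")
        totals[day] = totals.get(day, 0) + (end - start).total_seconds() / 60
    return totals

print(duration_by_weekday(rows))
```

With the sample rows above, the two Monday runs sum to 95 minutes, which is exactly the kind of day-to-day trend the stored procedure surfaces.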


Here is a select statement to show all the indexes that were included in maintenance on a specific date. I wanted to see which indexes were modified today, so I used the script below.

CommandType Count Script

I needed a quick SELECT to show the number of times an index had been reorganized or rebuilt.
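The SELECT isn't pasted in this excerpt; the counting amounts to something like this (index names below are made up):

```python
from collections import Counter

# Hypothetical (IndexName, CommandType) pairs pulled from CommandLog
entries = [
    ("IX_Orders_Date", "ALTER_INDEX"),
    ("IX_Orders_Date", "ALTER_INDEX"),
    ("IX_Customers_Name", "ALTER_INDEX"),
]

# Count how many times each index appeared in an ALTER_INDEX command
counts = Counter(name for name, cmd in entries if cmd == "ALTER_INDEX")
for name, n in counts.most_common():
    print(name, n)
```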

Wednesday, September 24, 2014

SQL Server User Security Report Script

I needed a way to report on what permissions are assigned to a user on SQL Server. As an added bonus I wanted to be able to specify a group and then list out all the users in that group with their permissions. As a double added bonus I wanted to set a flag so you could generate the DDL for a new user in case you want to duplicate the permissions to a new user or group.

Bingo. First version here. I am sure there are several versions of this on the web, but at the moment this one is working well for me.
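The script itself isn't shown in this excerpt, but the DDL-generation flag presumably emits statements along these lines. The login, database, and role names here are hypothetical, and the generator is sketched in Python for illustration:

```python
def user_ddl(login, database, roles):
    """Emit CREATE USER / role-membership DDL for duplicating permissions.
    login, database, and roles are hypothetical example values."""
    ddl = ["USE [{0}];".format(database),
           "CREATE USER [{0}] FOR LOGIN [{0}];".format(login)]
    for role in roles:
        # ALTER ROLE ... ADD MEMBER is the SQL Server 2012+ syntax
        ddl.append("ALTER ROLE [{0}] ADD MEMBER [{1}];".format(role, login))
    return "\n".join(ddl)

print(user_ddl("new_user", "SalesDB", ["db_datareader", "db_datawriter"]))
```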

Meeting and Email Cost Calculators

I don't know about you, but I hate meetings and useless emails. So much time is wasted in the corporate world today on meetings and emails. How do we combat this huge waste? I think one of the first steps is awareness. One quick and easy way to make someone aware of the waste involved in emails and meetings is to quantify it. I created some calculators to quantify meeting and email waste.

I present the Improvidence Monday-Friday 8 to 5 web site: http://www.thejoestory.com/imf85

  • Work Day Efficiency Calculator - Calculate how efficient your work day is. This calculator takes your meeting time and time spent on useless emails to quantify how much of your day is wasted on such mind numbing tasks.
  • Meeting Cost Calculator - Calculate how much a meeting costs the company in lost productivity time.
  • Email Cost Calculator - Quantify how much money that awesome reply all thread costs the company in time lost.
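The site's exact formulas aren't shown here, but the basic arithmetic behind the two cost calculators presumably looks something like this (the rates and times are made-up examples):

```python
def meeting_cost(attendees, avg_hourly_rate, hours):
    """Lost productivity: people x rate x time."""
    return attendees * avg_hourly_rate * hours

def email_cost(recipients, avg_hourly_rate, minutes_to_read):
    """Cost of one message landing in everyone's inbox."""
    return recipients * avg_hourly_rate * (minutes_to_read / 60.0)

# A one-hour meeting with 6 people averaging $50/hour:
print(meeting_cost(6, 50.0, 1.0))   # 300.0
# A reply-all thread hitting 40 people for 3 minutes each at $50/hour:
print(email_cost(40, 50.0, 3))      # 100.0
```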

  • Here is a screenshot from a recent email thread I was a part of.

    Friday, September 19, 2014

    PowerShell Parse Remote Event Log Email Results

    I needed a quick and easy way to parse the System Event log for a specific Event ID. If the Event was found in the last 24 hours, send me an email.

    Here is what I came up with: the code will look for Event ID 26 occurrences within the past 24 hours. I limit the number of entries parsed to 200 using the -Newest parameter, which saves a lot of time on the parse. If you set this thing to parse the entire system log you will be waiting a while. If we find an event, email me.
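The PowerShell itself isn't pasted in this excerpt. As a sketch of the same filtering logic in Python (the sample events below are hypothetical):

```python
from datetime import datetime, timedelta

def recent_events(entries, event_id=26, newest=200, now=None):
    """Filter the newest N entries for a given event ID within 24 hours.
    entries: (event_id, timestamp) tuples, newest first (like -Newest)."""
    now = now or datetime.now()
    cutoff = now - timedelta(hours=24)
    return [(eid, ts) for eid, ts in entries[:newest]
            if eid == event_id and ts >= cutoff]

now = datetime(2014, 9, 19, 12, 0)
entries = [(26, datetime(2014, 9, 19, 9, 0)),     # hit
           (7036, datetime(2014, 9, 19, 8, 0)),   # wrong event ID
           (26, datetime(2014, 9, 17, 9, 0))]     # too old
hits = recent_events(entries, now=now)
# if hits: send the email
```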

    Trash Tracker

    I was walking around the park on one of my speed walks when a bolt of inspiration hit me. It was Monday, and Monday is trash day at Westwood. One thought led to another, and I started thinking: why am I not tracking my trash yet?
    Over my lunch break I mocked up TrashTracker v 1.0. I just needed something simple, something easy to use from my mobile, with plans to use it Thursday mornings as I am taking out the trash.
    I first decided to track the bag count and quickly realized this is not optimal. We always have 2 bags of trash. So then I added the following attributes for the 2 bags of trash:
    • Weight - I am using a simple bathroom scale at the moment with plans to find a better way to weigh the trash
    • Smell - purely subjective rating from 1 - 10 on smell. 10 being the most putrid bag of trash you have ever come in contact with.
    • Garbage water - A simple yes or no to record whether the bag busted and leaked garbage water on the floor or, worse, on your socks and shoes.
    I also added a notes section for various notes, and of course we track the date and time of the record.
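As a rough sketch, one record could look like this in Python (the field names are my guesses at the schema, not the app's actual code):

```python
from datetime import datetime

def trash_record(bag, weight_lbs, smell, garbage_water, notes=""):
    """One TrashTracker row: weight, smell rating, leak flag, notes, timestamp."""
    assert 1 <= smell <= 10, "smell is a 1-10 subjective rating"
    return {"bag": bag, "weight_lbs": weight_lbs, "smell": smell,
            "garbage_water": garbage_water, "notes": notes,
            "recorded_at": datetime.now().isoformat()}

rec = trash_record(1, 35, 9, False, "had to air out the garage")
```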
    I used jQuery Mobile for the site. Here is a screen shot. Again, I just needed something simple to collect the data; I am more interested in the data itself.
    So I recorded the first night's worth of trash. Here is the breakdown.
    Bag 1
    • Weight: 35 Pounds
    • Smell: 9/10
    • Garbage Water: No
    Bag 2
    • Weight: 18 pounds
    • Smell: 3/10
    • Garbage Water: No
    What a great night to start the data collection. Bag 1 was horrible. We had to put it outside 3 days ago. The smell was so putrid it permeated into the house from the garage. The house started smelling like garbage. I had to take the bag out to the back porch and leave the garage door open today to air it out.
    So there you have it. Another data set to collect and analyze. I would like to capture data for a year. We’ll see how long it goes.

    First Run Food Inspections

    Food Inspector reports in KC are available on a dreadfully designed web site that doesn't allow much room for creativity, as in you cannot download the data. So I started taking a run at the data with my new-found Python web scraping skills.
    The map below is a first run at the data. I realized that my attempt to grab all the restaurants was unsuccessful, so I need to go back and tweak that a bit. However, the first run report was pretty cool and worth the post:
    Full screen link here: http://thejoestory.com/687182
    I am using Google Fusion Tables, which is gangsta if you have a ton of addresses you want to map and/or geocode. Google did all the work without any pesky API calls. You just tell it what column contains your location data and Google does the rest.
    On the map you will notice the dots are color coded.
    • Green dots are places that have less than 5 critical violations.
    • Yellow dots are places with 5 to 15 critical violations.
    • Red dots are places with over 15 critical violations.
    Now remember there are varying levels of criticality. I am working on grabbing the rest of the restaurant data. I am also working on a way to let you drill down and actually read the report to see how critical a critical violation is. Good times.
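The dot color rule above boils down to a tiny function, sketched here in Python:

```python
def dot_color(critical_violations):
    """Map a restaurant's critical-violation count to the map's dot color."""
    if critical_violations < 5:
        return "green"
    if critical_violations <= 15:
        return "yellow"
    return "red"
```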

    Name Data

    Here is the first shot at working with name data from the Social Security web site.
    The site allows you to enter a list of first names plus the gender like so:
    Then the site will chart out the popularity trend of the names. You can do a single name as well by just entering Name|Gender.
    The data goes all the way back to the late 1800s; however, the older data is less accurate than current data since Social Security number usage was not consistent in the early years.
    Using the relative risk calculation, I am working on showing year-to-year increasing and decreasing spikes in popularity for given names. For example, the sudden decrease of the name Katrina after 2005.
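As a sketch, the year-to-year comparison might look like this (the counts below are hypothetical, not real SSA numbers):

```python
def yoy_ratio(counts, year):
    """Relative change in a name's count vs. the prior year (>1 up, <1 down)."""
    return counts[year] / float(counts[year - 1])

# Hypothetical counts for the name Katrina (illustration only):
katrina = {2004: 1000, 2005: 950, 2006: 400}
print(yoy_ratio(katrina, 2006))  # well below 1, a sharp drop after 2005
```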
    More to come on this; in the meantime it is kinda cool to plug names in and try to figure out what caused the increase or decrease in a name's popularity.

    Update: 9/19/2014

    I changed the layout a bit and added some more stats, including highest year-to-year increase and decrease of a name. Now you can track the trends of a name you specify.

    Missouri Schools 2014 Annual Report

    Our local NBC affiliate reported that the 2014 APR for Missouri schools had been released for review. Well, I like education and I like numbers, so naturally I had to click on that story. I wasn't surprised when I clicked on the link to the report to find the report in PDF format…boo. C'mon, people, tabular, easy-to-consume data is what we want.
    Here is a quick summary of the APR guidelines. I do not really understand it all; more information can be found on the Missouri Department of Education web site. Basically, each school can earn X amount of points (this varies from school to school). The school is then graded using the APR guidelines to produce a total points earned score.
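The scoring arithmetic can be sketched as follows (the district's point totals here are hypothetical):

```python
def pct_points_earned(earned, possible):
    """APR score as a percent of the total points possible for that school."""
    return 100.0 * earned / possible

def yoy_change(pct_prior, pct_current):
    """Difference between two years' percent-of-points-earned scores."""
    return pct_current - pct_prior

# Hypothetical district: 126 of 140 possible points in 2014, 119 in 2013
p14 = pct_points_earned(126, 140)   # 90.0
p13 = pct_points_earned(119, 140)   # 85.0
print(yoy_change(p13, p14))         # 5.0
```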
    So the first step was to grab the data and clean it up into a usable format. I thought it would be cool to grab the addresses for all the school districts as well, so that we could throw the schools on a map. With a few minutes of searching I was able to dump all the school district addresses into a CSV. From there it took some dynamite copy/pasting skills to merge the two data sets. The final result was this spreadsheet: School Performance.xls
    Now the easy part: convert the xls into a csv file, throw the csv into a Google Fusion Document, and let Google do all the geocoding work for me. Now we can slice and dice the data however we want.
    I created a couple quick maps using the data.
    Here is a map charting the difference between 2013 and 2014 percent of total points earned. This shows schools that have increased or dropped from their 2013 performance report numbers.
    The Second map here is color coded by percentage of total points earned. I am still tweaking the color coding to match the data set. 
    Anyway, sending this data out to my educator friends. I may slice the data up a little more depending on the feedback.
    Here is a link to the Google Fusion Document used to build the maps: 
    If you want a map created or want the data sliced a certain way, holler. 

    Blink(1) first project

    I have been waiting about 3 years to purchase a Blink(1), and just about every time I go to purchase one they are out of stock. I stumbled over to the website last week, noticed they were in stock, and quickly scooped one up.
    So what is the blink(1), you may be asking? It is a set of ultra-bright multicolor LED lights that you plug into USB. Using the control software and some handy scripting skills you can set up a wide variety of rules to make the blink device flash colors. Well wooptie freakin doo, Joe, I still don't understand what is so cool about this.
    How about we just take it for a test spin? The blink(1) has its own IFTTT channel. IFTTT is a site that allows you to create a wide array of rules, apply them to various web services and/or web-connected devices, and perform tasks. For instance, let's look at one of the most popular blink recipes (recipes are what the site calls these trigger/action rule sets). You can set the blink(1) to notify you when you get a new email on Gmail.
    • First you activate the Gmail and blink(1) channels in your IFTTT account.
    • Then you can browse the recipes for blink and add the new Gmail rule recipe.
    • Once you set up the rule in IFTTT, you use the Blink(1) control software to add the rule to the blink.
    • You can set various properties in the Blink(1) control screen that control what colors and patterns to use for the blink notification.
    • Now you sit back and wait for the rule to fire.
    Ok, so Joe, the last thing I need is another notification about email. I agree; this is just a simple example of how to integrate the blink(1) with IFTTT, and it is just scratching the surface of the capabilities of the device.
    Ragweed season is a wonderful time of year at team story's house. I thought it would be cool to have the blink(1) notify me when the pollen counts were high. Using Python web scraping skills we can find today's pollen count, determine if the count is high, and then write a color to a text file. Using the Blink(1) control center we set up a rule to monitor that text file for changes.
    • First I reviewed this web site for web scraping opportunities: http://www.wunderground.com/DisplayPollen.asp?Zipcode=64157
    • I then created the Python Script (pasted below) to parse the page and find today’s pollen count
    • If the pollen count is > 9.0, I want to set the text file with the notification pattern fireengine. There are several built-in patterns out of the box, and you can create your own customized patterns.
    Check out a video of the high pollen count in action: http://youtu.be/X_fO_9uEX2c. The cool thing about file notifications is that the blink(1) control software keeps track of the file's modified date, so if the file has not changed it will not fire off the notification. You will only get notified if the file is updated.
    Anyway, again this is just scratching the surface. I hope to get more time to explore and play with the new toy in the next few weeks.
    Pollen Python Script
from bs4 import BeautifulSoup
import urllib2

f = open(r'C:\Python27\projects\pollen\outfile.txt', 'w')
errorFile = open(r'C:\Python27\projects\pollen\error.txt', 'w')

try:
    html = urllib2.urlopen('http://www.wunderground.com/DisplayPollen.asp?Zipcode=64157').read()
    # html5lib is the lenient parser; it handles the page's sloppy markup
    soup = BeautifulSoup(html, 'html5lib')
    tableStats = soup.find("table", {"class": "all-clear pollen-table"})
    for row in tableStats.findAll('tr')[1:]:
        try:
            col = row.findAll("td", {"class": "levels"})
            if not col[0].p.string:
                levels = "0"
            else:
                levels = col[0].p.string.strip()
            # Anything above 9.0 counts as a high pollen day
            if float(levels) > 9.0:
                f.write('pattern: "fireengine"')
        except Exception as e:
            errorFile.write(str(e) + '**********' + str(col) + '\n')
except Exception as e:
    errorFile.write(str(e) + '\n')
finally:
    f.close()
    errorFile.close()