World of Warcraft - Classic

Oh man, classic World of Warcraft is awesome. I love having the hardware and knowledge to run fun projects like this.

This is perhaps the only reason I purchased a Wii U when they originally came out. I cannot wait to get my hands on this game. It's going to be fantastic.

These videos continue to impress me. How awesome would it be to see The Ocarina of Time completely redone in Unreal Engine 4?!

The Division Beta - vtequine

I've been having an awesome time playing through the beta for The Division over the last couple of days. The beta is much more refined and polished than the alpha: I've managed to actually complete missions on my own, and matchmaking and teaming up with other players is also much easier. I also made my way into the Dark Zone and picked up some new weapons and gear. One thing I do wish is that there were a few more missions outside of the Dark Zone.

I am definitely looking forward to the final version of The Division come March 8th. I've put in my pre-order for the Gold Edition, which includes a season pass and some other fun goodies. I hope everyone enjoyed playing through the beta; there's only about 18 hours left, so I'll probably log a few more hours! 🎮

This is definitely the funniest thing I've seen in a while. 😝 😝 😝

Had a great time playing through this game. Who doesn't love horseback riding and space shuttles!?

Attack on Titan - Live Action Movie

I cannot wait for the second half of this movie to come out... 😍 😍 😍

Horseback riding

Houston Traffic

Now why can't traffic be like this all the time! 😝

I've been working hard to roll out the next version of my video game website, and today I finally pushed it out to the live site. I had some trouble with the Bootstrap date picker scripts, but I finally got them working. I had been dreading working on the site because I know so very little JavaScript. I probably could have rolled out the site about a month ago if I hadn't been such a big baby about this small piece of the site.

Anyways, here's some screenshots!

Destiny - Game Profile

This is what the game profile page looks like. I also got the Recently Completed By and Recently Favorited By blocks working a lot better. The rows and columns of the userpics display much better now.

jimmy's Profile

morgan's Profile

These are a couple of examples of how user profiles look now. As you can see, we now support a cover picture.

There's still a lot more we want to do with user profiles and the site. Items I would love to get working this coming year:

  • Allow users to select and show the consoles they own on their profile
  • Show breakdowns of the genres and consoles users play on
  • Have some sort of friend feed
  • Allow users to set up and manage gaming groups
  • Possibly set up some sort of private messaging system
  • Review all the code again and make sure it's clean and tidy

Overall though, I am definitely much happier with the site now. The only other thing we definitely need to be working on is getting all the games requested by users into the database. I think we have some 3,500 pending requests at this point.

JSBin - Self Hosted

I've just started using this powerful tool. It also happens to be self-hostable, which I am a huge advocate for in any web software. I generally use it to tinker around with HTML and CSS, as I'm not that good with JavaScript, but I could definitely see it being useful once I learn more about that language.

Now I just need to keep looking for a self-hosted solution that allows for real-time collaborative code sharing. Not a full-on IDE, just something with nice syntax highlighting for multiple languages that lets me work on code in real time with a friend or co-worker.

My JSBin Instance.

StackEdit Self-Hosted

I recently brought on a new team member for one of my many projects. We had an immediate need for documentation collaboration. My first idea was to use Hackpad or Etherpad, but we were going to be working a lot with Markdown, so I Googled around a bit and came across StackEdit!

As with most of the tools we use, I prefer self-hosted solutions more often than not. Thankfully StackEdit fit that bill.

My StackEdit system is a VPS with 4 virtual cores and 6 GB of RAM, running CentOS 6.x.

Install Prerequisites

yum -y install nano gcc gcc-c++ git

Install Node, Gulp & Bower

StackEdit runs off Node so we'll need to get that installed along with a few other npm packages.

cd /usr/src
wget http://nodejs.org/dist/v0.10.30/node-v0.10.30.tar.gz
tar zxvf node-v0.10.30.tar.gz
cd node-v0.10.30
./configure
make
make install

Next Gulp:

npm install --global gulp

And finally Bower:

npm install -g bower

The StackEdit Source Code

This one is pretty quick and easy!

cd /opt/
git clone

Now we need to install some of its dependencies.

npm install
bower install --allow-root

Of note, if you're not running these commands as root you may omit the --allow-root from the Bower command.

CouchDB - For Lazy Couch Potatoes!

Alright! So StackEdit will work now; however, if you want to allow synchronizing and sharing files between people, you'll want to set up Google Drive, Dropbox or CouchDB. I didn't have the time to set up Google Drive or Dropbox, so I went with CouchDB. In fact, anyone who uses StackEdit or your StackEdit instance can set up CouchDB to use with StackEdit.

I really recommend reading this documentation on how to setup CouchDB, but I'll give a brief overview here.

First go to SmileUpps. They offer free CouchDB hosting. Sign yourself up for an account, then set up a domain for your project.

Next you'll want to go into the CouchDB configuration to set a few items up.

CouchDB - SmileUpps

Click on the Configuration option in the right sidebar. You should see a new page with a table of configuration options. Scroll to the bottom and click on the 'Add a New Section' link, and a little dialog box pops up. Enter 'httpd' in the first field, 'enable_cors' in the second field and 'true' in the third field (without the quotes, of course). Hit the Create button.


Again, go to 'Add a New Section' and enter in 'cors' for the first field, 'origins' for the second field and 'http://localhost,' for the third field.
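For reference, these same two settings can also be applied over CouchDB 1.x's `_config` HTTP API instead of clicking through the web interface. This is just a sketch: the base URL below is a placeholder, and the helper only builds the PUT requests rather than sending them.

```python
import json

def cors_config_requests(base_url):
    """Build the PUT requests that mirror the two 'Add a New Section'
    steps above: httpd/enable_cors=true and cors/origins."""
    settings = [
        ("httpd", "enable_cors", "true"),
        ("cors", "origins", "http://localhost"),
    ]
    # CouchDB's _config API expects each value as a JSON-encoded string.
    return [
        ("PUT", "%s/_config/%s/%s" % (base_url, section, key), json.dumps(value))
        for section, key, value in settings
    ]

# Placeholder host; substitute your SmileUpps domain.
for method, url, body in cors_config_requests("https://example.smileupps.com"):
    print(method, url, body)
```

You could feed these tuples to curl or any HTTP client (with your admin credentials) to script the setup instead of doing it by hand.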

Now back on your server you'll want to run the following commands:

curl -X PUT
curl -O
node setup.js

That should be all you need to get CouchDB going. I also recommend going to your StackEdit instance and updating the CouchDB URL: go to Menu > Settings > Advanced and set the CouchDB URL.

Finish Up!

Your StackEdit instance should be ready to roll. You can configure other items in the `` file.

One thing I had an issue with when sharing documents was that it kept using the default domain, which would likely cause some confusion. To remedy this I had to edit the ./public/res-min/main.js file in 2 places. The first is line 13692:

var l = "" + "editor#!" + e.param(a);

and line 13703:

var l = "" + "viewer#!" + e.param(a);

I restarted the application and all was well!


  1. My instance currently is not connected with any of the third-party apps like Dropbox, Google Drive or Google Analytics. I do not believe it would be terribly difficult to set these up, but I just didn't have the time or a real need.
  2. This is one of the first self-hosted apps I haven't had to proxy with Nginx. That's nice!
  3. I am not sure if the app should be running as root or as a less privileged account.
  4. I removed the donation links and alert boxes since I had no real use for them on my internal site.
  5. I also removed the link to Classeur since I had no real use for that being there.
  6. I noticed it looks like you can deploy your StackEdit instance to Heroku or Docker, but I didn't give those a shot. Maybe someday!


I previously wrote about installing my own instance of Reddit, but I haven't done a lot with it. This post is really just to share a couple of things I've been trying to work on to make the Reddit instance usable.

I've mostly just been working to get my instance to pull in text and link posts from the real Reddit; however, I know very little about Python, so currently I am only able to fetch some posts, and I am not sure how to get them into my instance as a specific user. Here's my script so far:


import praw

# Connect to reddit.com; no login is needed just to fetch posts.
r = praw.Reddit(user_agent="Ubuntu:reddit.local:v0.1 (by /u/jimmybreddit)")

# Note: the subreddit name here is a placeholder; the original value
# wasn't preserved in this post.
subreddit = r.get_subreddit("announcements")

for submission in subreddit.get_new(limit=10):
    print "---------------------------------"
    print submission.title
    if submission.is_self == True:
        print submission.selftext
        print submission.url
    print "---------------------------------"

I am going to keep working on it, and hopefully one day I can get my Reddit instance to pull in posts as a bot. 😄

Update - November 18, 2015: I've been working on my Reddit API script and this is what I've come up with:


import praw
import time
import calendar

# Read the Unix timestamp of the last check from the checkpoint file.
fh = open("/home/username/Scripts/", "r")
previous = fh.read()
fh.close()

r = praw.Reddit(user_agent='Ubuntu:reddit.local:v0.1 (by /u/username)', site_name='reddit')

# Note: the source subreddit name is a placeholder; the original value
# wasn't preserved in this post.
subreddit = r.get_subreddit('announcements')

for submission in subreddit.get_new(limit=50):
    # Only mirror submissions created since the last check.
    if int(submission.created) > int(previous):
        if submission.is_self == True:
            rl = praw.Reddit(user_agent='Ubuntu:reddit.local:v0.1 (by /u/username)', site_name='local_dev')
            rl.login(username="$USERNAME", password="$PASSWORD")
            rl.submit('announcements', submission.title, text=submission.selftext)
            print "Submitting Text: " + submission.title
        if submission.is_self == False:
            rl = praw.Reddit(user_agent='Ubuntu:reddit.local:v0.1 (by /u/username)', site_name='local_dev')
            rl.login(username="$USERNAME", password="$PASSWORD")
            # Ask the local installation whether this link has already
            # been submitted.
            check = r.request_json("http://reddit.local/api/info/.json", params={"url": submission.url})
            if check == '':
                rl.submit('announcements', submission.title, url=submission.url)
                print "Submitting Link: " + submission.title

# Store the current time (plus a timezone offset) for the next run.
timenow = calendar.timegm(time.gmtime())
timenow = int(timenow) + 25200

fh = open("/home/username/Scripts/", "w")
fh.write(str(timenow))
fh.close()

So what this script does is connect with reddit.com and pull the last 50 new submissions from a specific subreddit. It relies on a separate file which stores the date of the last check as a Unix timestamp, and it should only submit new submissions since that date. It also checks whether the submission is a text post or a link. If it's a link, it checks the local installation to ensure the link hasn't been submitted to the subreddit before.

Now, I still had some issues with this, where submissions coming in from reddit.com were posted in the future. I have no idea how that happened. This resulted in duplicate posts to my local installation, but beyond that the script worked well.
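One way around those future-dated duplicates, sketched below, would be to track the IDs of submissions already mirrored instead of comparing timestamps. The file path is a placeholder, and the helper names are my own.

```python
SEEN_FILE = "/home/username/Scripts/seen_ids.txt"  # placeholder path

def load_seen(path):
    """Load the set of submission IDs we've already mirrored."""
    try:
        with open(path) as fh:
            return set(line.strip() for line in fh if line.strip())
    except IOError:
        return set()

def filter_new(submission_ids, seen):
    """Keep only the IDs that haven't been mirrored yet."""
    return [sid for sid in submission_ids if sid not in seen]

def record_seen(path, new_ids):
    """Append freshly mirrored IDs so later runs skip them."""
    with open(path, "a") as fh:
        for sid in new_ids:
            fh.write(sid + "\n")

# Example: only "def456" would be submitted on this run.
print(filter_new(["abc123", "def456"], {"abc123"}))
```

Since submission IDs never change, this sidesteps clock weirdness entirely; the trade-off is that the seen-IDs file grows over time.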

Also of note, this script uses no authentication to reddit.com and username/password authentication to the local installation. Obviously not ideal. It should be using OAuth2, but I didn't have the time or patience to wrap my head around that.

Made with ❤️ by Jimmy B.