
Improved Grunt Tasks for the Vagrant WordPress Dev Box

Grunt WordPress Dev Kit

Last week I found myself having to rebuild my WordPress plugin development box after a “laptop fiasco”.   While it was a lot of work, it feels as though I am in a better position to not only recover my environment quickly but also distribute it to other developers who are interested in assisting with plugin development.

If you are interested you can read more about it in the related WordPress Workflow and WordPress Development Kit articles.

This morning I realized that having a new almost-fully-configured Vagrant box for my WordPress Development Kit allows me to make assumptions in my Grunt tasks.    While it would be more flexible to create options-based tasks where users can set their own configuration for things like MySQL usernames and passwords, the WP Dev Kit Vagrant box assumption allows me to bypass that for now and come back to it when time allows.  Fast turnaround and fewer interruptions in my already-busy workflow are paramount this week.

Today’s WordPress Dev Kit Updates

The official tag I’ve assigned to the newest WordPress Dev Kit is version 0.5.0.  Here is what has been added.

WordPress Database Reset

One of the tasks I do fairly often is to “clear the data cruft” from my development box WordPress tables.  I  accomplish this by dropping the WordPress database and recreating it.

The Vagrant box makes this far easier as I know that when I spin up the WP Dev Kit Vagrant box it already has the WordPress MySQL tables set up.  I also know the username and password.  As such I can execute a simple drop/create of the database as the privileges are already in place in the MySQL metadata and will carry over.   Thus I only need to execute a single mysql CLI command to reset the data.

To get this working in Grunt I added the grunt-ssh module and created a ‘resetdb’ target.

I can now reset my WordPress database with a simple grunt command:


$ grunt shell:resetdb
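
For reference, here is a minimal sketch of what that target can look like; this is illustrative only, not the actual WP Dev Kit source.  The configuration key mirrors the shell:resetdb command shown above (the grunt-ssh module mentioned earlier offers a comparable sshexec target that would take the same command plus host credentials), and the credentials match the Vagrant box defaults described later in these posts.

// Illustrative sketch of a "resetdb" target. It shells out to the mysql CLI
// and uses the known Vagrant box credentials (vagrant/vagrant) to drop and
// recreate the wordpress database.
shell: {
    resetdb: {
        command: 'mysql -u vagrant -pvagrant ' +
                 '-e "DROP DATABASE IF EXISTS wordpress; CREATE DATABASE wordpress;"'
    }
},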

Online Documentation

The other change I made today will help me remember how the heck all this stuff works.  Now that the dev kit has grown to a couple of commands I know I will soon be forgetting the nuances of certain build and workflow processes.   As I started creating my own Markdown files I realized that Bitbucket has a system for using .md files on the repository wiki.    The easy solution was to add the Bitbucket wiki as a submodule to the WP Dev Kit repository and edit the files there.    Doing so means that any doc update will also be published immediately when pushed back to the repo at the WP Dev Kit Bitbucket Wiki.

Now back to getting the Store Locator Plus and Enhanced Results first-pass testing run and prerelease copies published for my Premier Members.


Installing Sass on CentOS 6.5

SLP Sass Banner

I just discovered that Sass is missing from my WordPress Development Kit Vagrant box.   My Vagrant box is on the latest CentOS 6.5 release and, luckily, setting up Sass is very simple for vanilla CentOS 6.5 users.  It is literally a two-step process:

$ sudo yum install rubygems
$ sudo gem install sass

Now I can add Sass to the NetBeans executables configuration by entering the path /usr/bin/sass.

Configuring Sass in NetBeans

Now I can edit my .scss files and have the corresponding .css files auto-generated right from NetBeans.  Nice!


WordPress Development Fresh Start with Vagrant and Grunt

Banner Vagrant Cloud WP Dev Kit Box

HP finally did it.  They came out to replace a system board in my laptop and half of my files on my drive are corrupted.  Restoring the 70GB of corrupt files from the local USB3 backup will take some time and I am not 100% confident in the reliability of this system.   With 48 hours of downtime I decided it would be best to push forward with my Vagrant and Grunt setup for the WordPress Development Kit I’ve been working on.

Employing Vagrant

I am using VirtualBox and Vagrant to build a series of base boxes on which to layer my development environment.  Unlike most base boxes that are plain vanilla OS installs that then use provisioners to install all the software layers on top of the OS, my custom base boxes are going to be installed with all the base tools I need to develop WordPress plugins using my WordPress Development Kit tools.

Why?

Because it is far faster to download a compressed virtual machine image with most of my stuff installed than to download a very similar image with no tools and then have Vagrant go and download a few dozen install kits.   Sure, the standard method is more flexible and ensures everything is up-to-date, but that is not what I am trying to accomplish here.    I am building a “fast start” box that has most of what you need pre-installed in a known environment.

I also want to have a method to deploy new virtual boxes with everything in place on my OS/X system, a different laptop, or even a cloud-based server the next time my HP laptop is smoked.  Which will probably be tomorrow.      As I’ve found, restoring a large image even from a local USB3 drive is not a quick process and it is not foolproof.   Especially when going from a Windows 8.1 based backup and restoring on an OS/X system that has not been patched or updated in 8 months.

How…

Since I already have Vagrant installed and have a published base box I am able to get started quickly.   Once Vagrant is installed I only need to set my Vagrantfile, a script that tells Vagrant what to do, to pull down the base box from the URL I have published on Vagrant Cloud:

  • Create a folder on my host system named “C65 WP Devkit  Box Setup”.
  • Create the Vagrantfile in that directory, with config.vm.box pointing to charlestonsw/centos6.5-wordpress-dev-kit-base-box.
  • Open the command line for Windows and go to that C65 WP Devkit Box Setup directory.
  • Run the vagrant up command.
My new Vagrantfile:

# -*- mode: ruby -*-
# vi: set ft=ruby :

# Vagrantfile API/syntax version. Don't touch unless you know what you're doing!
VAGRANTFILE_API_VERSION = "2"
Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.box = "charlestonsw/centos6.5-wordpress-dev-kit-base-box"
  config.vm.provider "virtualbox" do |vb|
    vb.gui = true
  end
end

What…

When you boot the WordPress Dev Kit Base Box with Vagrant you will find that the full CentOS GUI is active. You can login to the GUI using the standard vagrant username with a password of vagrant.

WordPress 3.9

You will find a Firefox icon on the menu bar. Opening Firefox will open the http://localhost/ URL which brings up the local copy of WordPress 3.9 that has been installed.

MySQL for WordPress

The MySQL database has been configured for WordPress.

Database name: wordpress
MySQL User: vagrant
MySQL Password: vagrant

NetBeans 8.0

NetBeans 8.0 has been installed with the PHP and HTML 5 support packages.   NetBeans is a great no-cost IDE that I have found to be very useful for editing structured WordPress PHP code.   Write your code using an object oriented design and add a good amount of phpDoc commenting and NetBeans becomes a coding power tool for WordPress plugin coding.

Firefox

Firefox is installed along with several add-on packs that I use for everyday code management and development.  It includes Firebug to look at JavaScript, CSS, and HTML construction.   Selenium IDE is included which allows me to download and execute my web-interface based test scripts for Store Locator Plus.   LastPass is installed to provide unique long-string passwords to all of my web services.

WordPress Development Kit

My command line WordPress Development Kit is installed on the system under the vagrant user home directory in ~/wp-dev-kit. This comes with the basic Grunt build tasks I use to manage my plugins. While you can edit the ~/wp-dev-kit/grunt/plugins.json file and configure this for your site it is recommended that you create a fork of my WordPress Development Kit on the Bitbucket site and work from your forked repository. I would remove the wp-dev-kit directory with rm -rf and clone your forked copy into the same location.

The easiest method for cloning a public repository is to use git clone with the https repository path. If you have a private repository you may want to create a SSH key pair by going to ~/.ssh and running ssh-keygen. The key will need to be added as an authorized SSH key in your Bitbucket account access management list.  Since I will be pushing and pulling content from my various Bitbucket git repositories I will use this method when setting up my clone of the WP Dev Kit Basic Box.

Bitbucket HTTPS Path

Similar methods can be employed with Github repositories.

Preparing For Development

These are elements that will eventually go into a provisioner setup for the Vagrant box, assuming that at least one of the Vagrant provisioning services can handle user prompts and communication with third party services.

Create Your SSH Key

This will make it easier to push & pull content from your repositories.

cd ~/.ssh
ssh-keygen
xclip -sel clip < ~/.ssh/id_rsa.pub
Setting A Vagrant SSH Key

Log in to your Bitbucket account, go to My Account / Access Management, and add the copied SSH key.

Bitbucket Add Vagrant Key

 Configure Your Git ID

git config --global user.name "Lance Cleveland"
git config --global user.email lance@thisdomain.com

Add SmartGit

I like the GUI interface that SmartGit provides over the git command line and gitk branch rendering.  I find SmartGit to be twice as efficient for my method of work flow over the built-in IDE and command line, so I always install this next and clone my base projects like the WP Dev Kit and my Selenium IDE scripts.   Today I am using the SmartGit 6 beta release as I find the project grouping and new interface design to be a big help in managing my projects.

SmartGit UI

I immediately setup SmartGit and clone my Selenium IDE repository so I can complete the next step with a script.

SmartGit Bitbucket Setup

Complete The WordPress Install

Open Firefox and go to http://localhost/

Enter the WordPress local site information for your test site.  I use my Selenium IDE new site install script to handle this for me.

Selenium IDE setting up a new WordPress install.

Ready To Go

Now my system is ready to go.   I can start cloning my plugin code repositories into the ./wp-content/plugins directory, edit code with NetBeans, commit changes with SmartGit, and publish to my server or the WordPress plugin directory using my Grunt scripts.

With the current configuration it takes me 15 minutes to pull down a new clone and get the box booted then 5 minutes to install and configure SmartGit, clone my repository, run the WordPress install script, and fetch my first plugin repo.   20 minutes from “nothingness” to writing code.    With a local .box file on my system that time is cut down to about 8 minutes by avoiding the 1.5GB download of the box.

Not bad.

Now on to the code…


WordPress Development Kit Plugin Released

Grunt WordPress Dev Kit

The WordPress Development Kit plugin works hand-in-hand with the WordPress Development Kit system to assist in publishing public and private WordPress plugins.   I am using this system of Grunt tasks to manage the free plugins listed in the WordPress Plugin Directory as well as the premium add-on packs.

The WordPress plugin that goes along with this system communicates with the Grunt metadata files to present the latest plugin information on the website.   The plugin is now in charge of keeping the version page updated with the latest production release information.   In the version 0.4.0 release, which was published to the WordPress Plugin Directory today, a file list-and-download user interface is also available.

The 0.4.0 release now has the following base features:

  • List all the WP Dev Kit managed plugins in a formatted HTML output.
  • Filter the plugins listed to show only the prerelease or production listing.
  • Create a formatted list of downloadable files with version information, using alt and title hover text for file size and slug.
  • Create a detailed listing, including the changelog for plugins managed by the WP Dev Kit that have been published with readme data.

You can see the plugin in use on the Premier Subscription downloads page, the version info page, and some plugin details pages.  This update will help bring clarity to the prerelease and production versions that are available for download for all Premier Members.   Premier Members can select the Products/Download menu item on the top of this page to see the latest list of plugins that are available.

The plugin and the WP Dev Kit Grunt-based system are both available via my Bitbucket repository as public open-source projects.

 

Screenshots

WordPress Dev Kit 0.4.0 Download List

Improving WordPress Plugin Development with Sass

SLP Sass Banner

If you’ve been following along since my WordCamp Atlanta trip this spring you know that I’ve been working on automating my WordPress plugin development and production process.    If you missed it you can read about it in the WordPress Workflow and WordPress Development Kit articles.   Since I needed to patch some basic CSS rules in my Store Locator Plus themes (plugin “sub-themes” that style the store locator interface within a page), I decided now was the time to leverage Sass.

Sass Is In The House

It was one of my first sessions at WordCamp Atlanta and I KNEW it was going to be part of my automation process.    Sass is a CSS pre-processor.   Store Locator Plus has its own “theme system”, a sort of plugin sub-theme that lives within the WordPress over-arching site theme.     The SLP themes allow users to tweak the CSS that renders the search form, map, and results of location searches to create in-page layouts that better fit within their WordPress theme layout.

Until this past release it was a very tedious process to update themes or create a new theme.    In the latest release there are some 30-odd SLP theme files.    The problem was that when I found an over-arching CSS issue, like the update to Google Maps images that rendered incorrectly on virtually EVERY WordPress Theme in existence, fixing it was a HUGE PAIN.   I was literally editing 30 files and hoping my cut-and-paste job went well.   Yes, I could have used CSS @import statements but that slows things down by making multiple server requests to fetch each included CSS file.   Since the store locator is the most-visited page on many retail sites performance cannot be a secondary consideration.   Sass deals with that issue for me and brings some other benefits with it.

There are PLENTY of articles that describe how to install Sass, so I am not going to get into those details here.  On CentOS it was a simple matter of doing a yum install of ruby and ruby gems and a few other things that are required for Sass to operate.  Google can help you here.

My Sass Lives Here…

For my current Sass setup I am letting NetBeans take care of the pre-compiling for me.    It has a quick setup that, once you have Sass and related ruby gems installed, will automatically regenerate the production css files for you whenever you edit a mixin, include, or base SCSS file.

NetBeans Sass Setup

I combine this with the fact that the assets directory is ignored by the WP Dev Kit publication and build tasks to create a simple production environment for my CSS files.   I store my SCSS files in the ./assets/stylesheets directory for my plugin.   I put any includes or mixin files in a ./assets/stylesheets/include subdirectory.     I configure NetBeans to process any SCSS changes and write out the production CSS files to the plugin /css directory.

The first thing I did was copy over a few of my .css files to the new stylesheets directory and change the extension to .scss as I prepared to start building my Sass rules.

Includes

I then ripped out the repeated “image fix” rules that existed in EVERY .css file and created a new ./stylesheets/include/_map_fix.scss file.     This _map_fix file now becomes part of EVERY css file that goes into production by adding the line @import 'include/_map_fix' at the top of the SLP theme .scss files.    Why is this better?   In the past, when Google has made changes or WordPress has made changes, I had to edit 30+ files.  Now I can edit one file if a map image rule is changing that has to be propagated to all of the css files.   However, unlike standard CSS @import rules, Sass preprocesses the includes and creates a SINGLE CSS file.   That means the production server makes ONE file request instead of two.  It is faster.

SLP Map Fix Include

As I iterated over this process I ended up with a half-dozen CSS rules that appear in MOST of my CSS files.    Since not all of the rules appear in all of my plugin theme files I ended up with a half-dozen separate _this_or_that scss files that could be included in a mix-and-match style to get the right rule set for each theme.     I also created a new _slp_defaults include file that does nothing more than include all of those half-dozen rules.  Nearly half of the current CSS files use all of the rules that were “boiled out of” the CSS files.

Store Locator Plus Default Includes

Mixins

Along the way I learned about mixins.   At first I was a bit confused as to the difference between include files and mixins.  Both are “pulled in” using similar commands in SCSS, @import for the “include files” and @include for the mixins, but what was the difference?    While you can likely get away with “faking it” and having mixins act like includes they serve different purposes.   I like to think of a mixin as a “short snippet of a CSS rule”.

A common example is a “set the border style mixin”.  In the simplest form it can set the border style with a rule for each of the browser variants.  This rule is not a complete CSS rule but rather a portion of a CSS rule that may do other styling AND set a border.    The mixin includes the -moz and other special rule sets to accommodate each browser.   Rather than clutter up a CSS entry with a bunch of border-radius settings, use a mixin and get something like:

.mystyle {
   @include mixin_border_radius;
   color: blue;
}

That is a very simplistic way of using a mixin. One advantage is that if you decide to change the default border radius settings in all of your CSS files you can edit a single mixin file. However that is not a typical use. Yes, you can create subsets of CSS rules, but it really gets better when you add parameters.

At a higher level a mixin is more than just a “CSS rule snippet”. It becomes more like a custom PHP function. In typical coder fashion, I snarfed this box-shadow mixin somewhere along the way:

// _csa_mixins.scss

@mixin box-shadow( $horiz : .5em , $vert : .5em , $blur : 0px , $spread : 0px , $color : #000000 ){
    -webkit-box-shadow: $horiz $vert $blur $spread $color;
    -moz-box-shadow: $horiz $vert $blur $spread $color;
    box-shadow: $horiz $vert $blur $spread $color;
}
@mixin csa_default_map_tagline {
    color: #B4B4B4;
    text-align: right;
    font-size: 0.7em;
}
@mixin csa_ellipsis {
  overflow: hidden;
  text-overflow: ellipsis;
  white-space: nowrap;
}

Since that rule is part of my default _csa_mixins that I tend to use in multiple places I use it as follows:


@import 'include/_csa_mixins';
@import 'include/_slp_defaults';

//blah blah boring stuff here omitted

// Results : Entries (entire locations)
.results_entry {
    padding: 0.3em;
    @include box-shadow(2px, 4px, 4px, 0px, #DADADA);
    border-bottom: 1px solid #DDDDDD;
    border-right: 1px solid #EEEEEE;
}

Notice how I now call the mixin with parameters. The parameters are passed to the mixin and the Sass preprocessor calculates the final rules to put in the CSS file. This makes the mixin very flexible. I can create all sorts of different box shadow rules in my CSS files and have the cross-browser CSS generated for me. No more editing a dozen box shadow entries every time I want to change a shadow offset.

Here is what comes out in the final production CSS when using the above mixin. You can see where the parameters are dropped into the .results_entry CSS rule:

.results_entry {
  padding: 0.3em;
  -webkit-box-shadow: 2px 4px 4px 0px #dadada;
  -moz-box-shadow: 2px 4px 4px 0px #dadada;
  box-shadow: 2px 4px 4px 0px #dadada;
  border-bottom: 1px solid #DDDDDD;
  border-right: 1px solid #EEEEEE; }

This is only the start of my journey with Sass and I’ve barely scratched the surface. However I can already see the benefits that are going to come from using Sass. In fact I already used it to fix a problem with cascading menus where one of the SLP theme files did not contain a rule set. Rather than copy-paste from another theme file that contained the proper rules I only needed to add @import 'include/_slp_tagalong_defaults' and the problem was fixed.

Going forward Sass will not only increase my throughput in CSS development but also improve the quality of the final product that reaches the customer.

My first SLP Theme file, Twenty Fourteen 01, that was built using these new Sass files is only a 136 line CSS file with LOTS of whitespace and comments.  When the final processing is finished it has all of the rules necessary to support the current add-on packs and style them nicely for the WordPress Twenty Fourteen theme in all major browsers.

A new SLP Theme: Twenty Fourteen Rev 01

 


WordPress Dev Kit : Grunt 0.3.0 and Plugin 0.0.2

Grunt WordPress Dev Kit

More refinements have been made this week to my WordPress Workflow and related WordPress Development Kit.  With new products going into production shortly and some older products coming out with new releases, I realized I needed a more efficient way to publish prerelease copies.  As part of the Premier membership program I am trying to get stable prerelease products in the hands of those Premier members that want them.   Some members like to test new releases or try out new features on their test systems before they come out.    It allows them to plan for future updates and provides an opportunity for feedback and updates before the new version is released.  A win-win for the Premier member and for Charleston Software Associates.

In order to support a formal prerelease and production configuration I realized I needed to be able to track two different versions and release dates separately.   Following the general format presented in other Grunt examples, this meant setting up new sub-sections within the plugins.json pluginMeta structure.   The new format looks something like this:

"wordpress-dev-kit-plugin": {
"production": {
"new_version": "0.0.02",
"last_updated": "2014-04-03"
},
"prerelease": {
"new_version": "0.0.03",
"last_updated": "2014-04-04"
},
"publishto" : "myserver"
}

Changing the structure meant updating both the Gruntfile.js for the development kit as well as the parser that is in the WordPress Development Kit plugin. The changes were relatively minor to address this particular issue, but I did learn some other things along the way.
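
To make the change concrete, here is a rough sketch (not the actual dev kit code) of the kind of adjustment the Gruntfile parser needs: with the nested structure, the target name is used to pick the matching sub-section before the version fields are read.  The variable names here are assumptions.

// Illustrative sketch: pick the sub-section that matches the build target.
var meta = grunt.file.readJSON('plugins.json').pluginMeta[slug];
var info = meta[target];   // target is "production" or "prerelease"
grunt.log.writeln(slug + ' ' + target + ': version ' + info.new_version +
                  ', last updated ' + info.last_updated);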

Tasks and Targets

In my own Grunt tasks I had been calling one of my parameters in my build sequence the “type”, as in the “build type”. However the configuration file examples online often talk about a “target”. A target would be something like “production” or “prerelease” that shows up in a configuration block like this one:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/",
        showProgress: true
    },
    production: { expand: true, cwd: "../public/<%= grunt.task.current.target %>/", src: ["<%= currentPlugin.zipbase %>.zip","plugins.json"] },
    prerelease: { expand: true, cwd: "../public/<%= grunt.task.current.target %>/", src: ["<%= currentPlugin.zipbase %>.zip","plugins.json"] }
},

I have updated my scripts and documentation terminology to refer to this parameter as the “target” to follow convention.

Simplify Configuration With grunt.task.current.target

I learned a new trick that helps condense my task configuration options. In one of my interim builds of the WordPress Dev Kit I had something that looked more like this:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        showProgress: true
    },
    production: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip","plugins.json"],
        path: '<%= myServer.path %>production/',
        srcBasePath: "../public/production/"
    },
    prerelease: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip","plugins.json"],
        path: '<%= myServer.path %>prerelease/',
        srcBasePath: "../public/prerelease/"
    }
},

A bit repetitive, right? I found you can use the variable grunt.task.current.target to drop the current target name as a string into a configuration directive:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        showProgress: true
    },
    production: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip","plugins.json"],
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/"
    },
    prerelease: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip","plugins.json"],
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/"
    }
},

Now that the prerelease and production path and srcBasePath values are identical they can be moved into the top options section.

Now if I can just figure out how to have shared FILE configurations which define the current working directory (cwd), source (src), and destination (dest) file sets I could eliminate ALL of the settings in the production and prerelease configuration blocks and leave them with a simple “use the defaults” setup like this:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/",
        showProgress: true
    },
    production: { },
    prerelease: { }
},

Maybe someday.

Simplify Configuration With Shared Variables

Another trick I learned is that common configuration strings can be put at the top of the configuration block and re-used. This is alluded to on the Grunt tasks configuration page but they never expound on how to use the common configuration variables. Here is how it works: define the variable at the top of the configuration block, then reference that variable inside a string like '<%= my_var %>'. Here is my example with some “fluff” missing from the middle:

// Project configuration.
grunt.initConfig({

    // Metadata.
    currentPlugin: currentPlugin,
    myServer: myServer,
    pkg: grunt.file.readJSON('package.json'),
    my_plugin_dir: '/var/www/wpslp/wp-content/plugins/<%= currentPlugin.slug %>',
    my_src_files: [
        '**',
        '!**.neon',
        '!**.md',
        '!assets/**',
        '!nbproject/**'
    ],

    // compress
    //
    compress: {
        options: {
            mode: 'zip',
            archive: '../public/<%= grunt.task.current.target %>/<%= currentPlugin.zipbase %>.zip'
        },
        prerelease: { expand: true, cwd: '<%= my_plugin_dir %>', src: '<%= my_src_files %>' },
        production: { expand: true, cwd: '<%= my_plugin_dir %>', src: '<%= my_src_files %>' }
    },

In this example you can see how I’m using the my_plugin_dir variable to set the path to the plugins I am working on in my dev box and my_src_files to list the files I want to add (or ignore) when pushing my development plugin directories to a zip file or the WordPress svn repo for publication.

This has simplified a lot of task configuration entries in my custom grunt tasks script.

That combined with smarter configuration blocks in areas like the SFTP node module has simplified my Grunt configuration which will make it less prone to errors and easier to maintain going forward.

Back to coding…


WordPress Dev Kit Plugin : 0.0.1

Grunt WordPress Dev Kit

For those of you that have been following along with my exploration of Grunt and automating my plugin workflow… I’m sorry.   Quite boring I’m sure, but remember this blog is my personal notebook as much as fodder for the 3 people that may be interested in this stuff.

Last night I extended my journey toward fewer manual updates, each of which adds an opportunity for human error, by creating the first WordPress Development Kit Plugin.  Yup, a plugin for plugin development.  Sort of.  What the new plugin is going to help me with is keeping my Plugin Version Info page updated.   Today it is very rudimentary with a basic list of plugin slugs, the version info, and release dates.  You can see it in action on my Plugin Version Info page. Ultimately the new Plugin companion to the WordPress Development Kit will be extended to get more information from the Grunt process into the public website via the automated toolkit.

My goal is to have ONE PLACE where plugin information is updated, preferably in the readme files.    For now the JSON file that drives Grunt will suffice, with future plans to scrape the readme data into the plugins.

What WordPress Dev Kit Plugin Does

The WordPress Dev Kit plugin is very simplistic in its current form.  It reads the plugins.json file from the WP Dev Kit and renders information via a shortcode on a page or post.

Version 0.0.3 of the plugin has the following shortcodes available:

  • Actions (default: list)
    • [wpdevkit action='list'] list details about all plugins
  • Styles (default: formatted)
    • [wpdevkit action='list' style='formatted'] list the details in an HTML formatted layout
    • [wpdevkit action='list' style='raw'] list the details in a print_r raw format
  • Types (default: basic)
    • [wpdevkit action='list' type='basic'] list basic details = version, updated, directory, wp versions
    • [wpdevkit action='list' type='detailed'] list all details = version, updated, directory, wp versions, description
  • Slug (default: none = list ALL)
    • [wpdevkit action='list' slug='wordpress-dev-kit-plugin'] list details about a specific plugin

The raw style will list the entire plugin metadata structure in a pre tag as a standard PHP dump format.

WP Dev Kit Plugin Setup

It expects that you have the WordPress Development Kit plugins.json file pushed to a production files directory on your server.   You can set the location of the file to any web-server-readable directory.    The location is specified in the Settings / WP Dev Kit menu in the admin panel.

If you are using the WP Dev Kit Grunt tasks the build:<slug>:production process will move the plugins.json file over to your server along with your production.zip files.

My Process And How This Helps

In case you’re wondering why I would go through the trouble of building a full-fledged plugin like this, here is my typical workflow:

  1. Edit code.
  2. Edit readme to update version, features, etc.
  3. Edit main plugin to update version, etc.
  4. Create zip file.
  5. FTP zip file to server.
  6. If a WordPress Plugin Directory plugin, fetch SVN repo, update trunk, commit, add branch, commit, push.
  7. Login to my website.
  8. Update version info page on website.
  9. Create blog post about new plugin update with features and screen shots if warranted.
  10. If a Store Locator Plus plugin update the HTML that is pushed through the in-product signage app.

Until today nearly every step of that process was manual.  With the WP Dev Kit running on my system and the WP Dev Kit Plugin on my public site the process is now:

  1. Edit code.
  2. Edit readme to update version, features, etc.
  3. Edit main plugin to update version, etc.
  4. Edit the WP Dev Kit JSON file.
  5. grunt build:<slug>:production  which automatically
    1. checks that the version in the readme.txt and plugin file match (new quality control test)
    2. creates the zip file
    3. uses SFTP to put the file on my server
    4. updates the WordPress Plugin Directory Listings (fetch, update trunk, commit, add branch, commit, push)
    5. pushes plugins.json which talks to the WP Dev Kit Plugin and keeps the version info page updated
  6. Login to my website.
  7. Create blog post about new plugin update with features and screen shots if warranted.
  8. If a Store Locator Plus plugin update the HTML that is pushed through the in-product signage app.

The magic happens in steps 4 and 5.   It automates many of the steps in the process.  As I refine the WordPress Dev Kit I will be able to eliminate more steps along the way.

Not a bad start.   As each new plugin update happens I will be refining and improving the automation files and plugin to create a better presentation and improve quality control at each step.

Learn about this process via the WordPress Development Kit articles.


WordPress Dev Kit : Grunt Helpers 0.2.0

Grunt WordPress Dev Kit

I am continuing on my quest to use Grunt to further automate my plugin development process. I hope to be generating better quality plugins through more automated testing and sanity checks during the production process. I also hope to take out a few of the steps required to get the plugins into the hands of my customer base. The WordPress Development Kit articles describe the process as it evolves, both to possibly help others that are considering automating with Grunt and to help people that are starting to work on plugins related to those I’ve created understand my process.

My Environment For Grunt Helpers 0.2.0

I’ve made some changes to my plugin production environment since the original “Legacy Publisher” and even my “Grunt Helpers 0.1.0” article.   Some of the changes are based on things I’ve seen elsewhere as I learn about Grunt and other changes are to have a better defined environment.   Here is a summary of how things are working with the Grunt automation in the latest iteration.

I have two types of production (ready for public consumption) plugins:

WordPress Plugin Directory Listings

The WordPress Plugin Directory Listings (WordPress Hosted) are managed via the standard subversion (svn) repository update process.   They are also packaged and put on my server for customers to download from my site.

Premium Plugins Served From My Site

The Premium Plugins (Premium) are only hosted on my server and made available to customers that have purchased the premium add-on packs.

For both types of plugins, whether WordPress Hosted or Premium, I now have two versions available: the production release (production), which is the official ready-for-deployment version, and the pre-release (prerelease), which is provided to select customers who want to try an upcoming release early.   Pre-release versions are beta releases that may not be fully tested.

Regardless of whether a plugin is a WordPress Hosted product or a Premium product, the pre-release is ONLY available from my server and only to select customers.    I use the Groups plugin combined with the WooCommerce add-on pack for Groups as well as the File Away plugin to manage access to the pre-release versions.

The Grunt Tasks

To support the various iterations of the plugins that are being produced I have created several grunt tasks that are managed by my WordPress Development Kit scripts.

grunt build:<slug>:production

This task will publish the production-ready copy of my plugin.   It goes through my development copy of the plugin directory, cleans out the “development cruft” and builds a .zip file that is ready for distribution.    It will then read the plugins.json file and determine the ultimate home for the production copy.  If the “publishto” property for the given slug is “wordpress” (maybe I should change it to “WordPress”) the product ends up on the WordPress Plugin Directory.    If the publishto property is “myserver” the product ends up on my web server in the production files directory.

grunt build:<slug>:prerelease

This task also cleans out the “development cruft” of the plugin directory and creates a zip file.   However in this mode the plugin zip file only ends up on my server regardless of the publishto property set in the plugins.json file.   The prerelease files are stored in a separate directory on my server from the production files.

My Grunt Configuration Files

There are now several files that are used to configure my development kit.

package.json

This now follows a standard Grunt package.json format.   It contains my author info, the grunt default project name, version, description, license, and a list of grunt dependencies.   Grunt dependencies are node.js modules that are used by this project.  Currently the list includes:

Those with the asterisk (*) are not currently used, but I know I will be making use of them in the future so I’ve left those recommended default modules in place.

plugins.json

This is now the home for all of the metadata about my WordPress plugins that are being managed by the WordPress Development Kit.  This includes variables that are later used by modules such as wp-deploy as well as my plugin specific information such as the slug and other information.   The format is typical JSON with the following nodes defined (a sample layout follows the list):

    • pluginMeta = an array of plugin slugs
      • <slug>
        • version = current production version of the plugin (most likely will be deprecated and auto-read from readme.txt)
        • name = the plugin name (most likely will be deprecated and auto-read from readme.txt)
        • description = the plugin description (most likely will be deprecated and auto-read from readme.txt)
        • publishto = where production files get published, either “wordpress” (WordPress Hosted) or “myserver” (Premium)
        • zipbase = the base name of the zip file, used if I want to create myplugin-test.zip instead of myplugin.zip
        • reposlug = the WordPress Hosted repository slug if not the same as <slug>
    • wp-plugin-dir = where the in-development plugins live on this server (my development server)
    • wp-username = my WordPress Plugin Directory username
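
For illustration, a plugins.json entry following this layout might look like the sample below.  The slug and values are placeholders, not the actual kit file.

{
    "pluginMeta": {
        "my-plugin-slug": {
            "version": "1.0.0",
            "name": "My Plugin",
            "description": "A placeholder description.",
            "publishto": "myserver",
            "zipbase": "my-plugin-slug",
            "reposlug": "my-plugin-slug"
        }
    },
    "wp-plugin-dir": "/var/www/wpslp/wp-content/plugins/",
    "wp-username": "my-wordpress-username"
}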

myserver.json

This file contains the details about my server that are needed for the SFTP push.  It includes things like my local path to the SSH key files and where I want to put stuff on the remote server.  A sample layout follows the list below.

    • host = my fully qualified host name (no http, etc. just a pure host name)
    • username = the username to use for sftp login
    • privateKeyFile = the path on my development system where my private key file lives
    • privateKeyPassFile = the path on my development system where the private key password is stored (will become a command line option, this is a security risk, though my dev system is fairly well locked down and isolated)
    • path = the path on the production server that is the root directory under which my production and pre-release files will go.
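
A sample myserver.json with the nodes described above; every value here is a placeholder.

{
    "host": "www.example.com",
    "username": "deploy-user",
    "privateKeyFile": "/home/vagrant/.ssh/id_rsa",
    "privateKeyPassFile": "/home/vagrant/.ssh/passphrase.txt",
    "path": "/var/www/downloads/"
}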

Process Details

Here is what happens when I execute the two most-used commands in my new build kit.

WordPress Hosted Production

command: grunt build:<slug>:production

This process gathers all the files in my plugin directory and copies the “cleaned” version of the files into ./assets/build under the plugin directory itself.  I store it here because assets is one of the directories that is ignored (cleaned) during production and it allows me to see exactly what got moved into production.    These files are then published to the WordPress svn repository using the wp-deploy module.   The same files are sent through compress with a .zip file created and stored in the wp-dev-kit/public directory (yes this seems redundant and can probably be simplified).  Finally the zip file from that public directory is sent over to my live server with SFTP and put into the production folder on the live server.
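
As a rough sketch of how that chain can be wired together, the task below strings the steps in order and branches on the publishto property.  It is illustrative only; the task and target names are assumptions, not the actual WP Dev Kit Gruntfile.

// Illustrative sketch of the build chain described above.
grunt.registerTask('build', 'Clean, package, and publish a plugin.', function(slug, target) {
    currentPlugin.slug = slug;                           // used by <%= currentPlugin.slug %> templates
    var meta = grunt.file.readJSON('plugins.json').pluginMeta[slug];

    var steps = ['clean:build', 'copy:build'];           // hypothetical clean/copy targets
    if (target === 'production' && meta.publishto === 'wordpress') {
        steps.push('wp_deploy');                         // push to the WordPress svn repository
    }
    steps.push('compress:' + target, 'sftp:' + target);  // zip the files, then SFTP them to my server
    grunt.task.run(steps);
});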

[box type=”alert”]I did have to create a patch for wp-deploy to properly manage subdirectories. I have submitted the patch to the author. Hopefully it makes it into his production release soon.[/box]

Premium Production

command: grunt build:<slug>:production

This process is identical to the WordPress Hosted Production process with one exception.  It skips the wp-deploy call and does nothing with svn.  It creates the ./assets/build files, zips them, puts them in the WP Dev Kit public folder, and copies them over to the production folder on my server.

WordPress Hosted and Premium Prerelease

command: grunt build:<slug>:prerelease

This process works much like the Premium Production process.  The only difference in the process is that the files are copied over to my server into a different directory that holds pre-release plugin files only.

Further Refinements

There are a number of refinements to be made to the process.  First on my agenda is reading the readme.txt file and extracting the version information so it can be appended to the zip file names for pre-release copies of the products.    I will then work on things like running the third party modules to do gettext sanity tests, JavaScript minification, and other “would be nice” features for production plugins.   I will also likely change this process a dozen times as I continue to iterate over both Premium and WordPress Hosted plugin builds over the next few months.

In the end I hope to have a handful of simple Grunt commands that manage the process efficiently and reduce the human error that my current process can introduce at several steps.

If you have suggestions or feedback on how to improve my process, please let me know!



What I Learned About Grunt Compress

Grunt Configuring Tasks Banner

I learned a few interesting things while setting up my WordPress Development Kit project with some new Grunt tasks.   These are my findings while getting some custom build tasks working with the Grunt compress plugin (grunt-contrib-compress).

You can set the default compress mode to zip by using the main.options attribute in the JSON config parameters.

You can tell Compress where to put the final zip file and what to name it using dynamic elements, such as the plugin slug as the base file name by using an anonymous function in the archive property.

Use the files property to set an array of files to be created.   In each element of the array you specify attributes such as what files are to go INTO the zip file, where they are stored in the zip file, whether or not to use relative paths and more.    This is where things were a bit confusing for me so I’ll expand on that in a moment.

Here are the starting Grunt Compress settings that I am working with.  I will explain what this does below:

    compress: {
        main: {
            options: {
                mode: 'zip',
                archive: function() {
                    return slug + '.zip';
                }
            },
            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/',
                    src: ['*'],
                    dest: 'public/',
                }
            ]
        },
    },

Here is a snippet of code that goes with the above configuration to do something.

  /**
   * makezip
   *
   * Build the zip file for the plugin, avoiding the assets directory, .git folders, and .neon files.
   */
  grunt.registerTask('makezip', 'Make a zip file for the plugin.', function(slug){
      grunt.log.writeln('Placeholder for makezip.  Slug name is ' + slug);
      global.slug = slug;
      grunt.task.run('compress');
  });

What Is In The Above Configuration

One of the first important things to note is that Grunt has a fairly robust built-in file manager. This file manager is available to all tasks and allows task files to use a default set of file rules such as the cwd, expand, src, and dest properties you see in the configuration section above. The Files section of the Grunt Configuring Tasks page will provide more insight beyond what I describe below.

archive

In the example above this is an anonymous function. The global variable “slug” is set in the makezip task and this is used to create the final zip file name. In my case it will be the WP Plugin Slug and .zip such as store-locator-le.zip for my Store Locator Plus plugin.

files.expand

The expand property tells the Grunt file processor to do dynamic source-and-destination processing.

files.cwd

Instructs the current processor, Compress in this case, to strip out the fully qualified path and make all file names in the processing loop relative to the parameter in the cwd command.  In my case it will make all files relative to my WordPress plugin root directory /var/www/wpslp/wp-content/plugins/.

file.src

This tells Compress which files are to be included in this round of processing.  For Compress it is the files that will be included in the .zip file distribution.   It uses the rules of something called minimatch as a file pattern matching system.   minimatch will grab as FEW files as possible so the '*' rule here works differently than typical operating-system wildcard file listings.   It will ONLY match the files in the exact directory specified.    In my case that is only the FILES (no directories or subdirectories) that are in my wp-content/plugins directory, which means it is only grabbing the legacy publisher scripts I put in the WP plugins directory on my dev box (blah, what a bad design).    I will explain how I fix this later.

file.dest

This one kind of threw me.   You can see I put public/ in as my destination.   I THOUGHT it would put the resulting .zip file in a folder named public under my current Grunt working directory with a <slug>.zip file in it.   WRONG.   What this does is tell Compress where inside the resulting zip file you want the files it “gathers” with the file.src pattern noted above.

In the setup above it created a file in the ROOT grunt directory named store-locator-le.zip.   Inside that zip file is a folder named “public” in which all the contents of my WP Plugin directory (base files only) reside.  NOT what I wanted!

My Grunt Compress Mess

Fixing The Initial Grunt Problems

The first thing to fix is getting the .zip file to go to the ./wp-dev-kit/public folder where I will fetch it with other tasks for publication to the WordPress public plugin directory or to my server for private access for premium add-on packs.   There are two items to fix: files.dest and the archive path.

Removing the dest: property from my files section solved the first issue.   Now the files that match the src specification will go into the top-level of the .zip that is created.

Adding ../public/ to the start of my anonymous archive function will store the files in my public folder which resides next to the running Grunt tasks folder.

The first two relatively minor issues are fixed, but there are deeper issues to resolve, specifically getting the files in the specified plugin directory and then adding some methods to ignore the files I don’t want as part of the kit.

    compress: {
        main: {
            options: {
                mode: 'zip',
                archive: function() {
                    return '../public/' + slug + '.zip';
                }
            },
            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/',
                    src: ['*'],
                }
            ]
        },
    },

Step 2 – Only Get The Specified Plugin

Getting the specified plugin directory wasn’t difficult, but it did involve a bit of Google and learning about named variable configuration in Grunt and how to get them into my “variable space” so I can use the <%= varname %> syntax in my Compress settings.    Luckily Chris Wren wrote a nice Grunt related article for newbs such as myself.

First step, add the variable declaration at the top of the Gruntfile, right above grunt.initConfig and inside the module.exports.

Grunt currentPlugin "global"
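
In code form, the idea looks roughly like the sketch below (the actual declaration is shown in the screenshot above; property names other than slug are assumptions):

// Minimal sketch: declare the object inside module.exports but above
// grunt.initConfig so the templating engine can resolve
// <%= currentPlugin.slug %> inside any task configuration.
module.exports = function(grunt) {

    var currentPlugin = { slug: '' };

    grunt.initConfig({
        // ... compress and other task configurations ...
    });

    // Tasks such as makezip set currentPlugin.slug at run time.
};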

With that in place I can now tell the Compress plugin to make all file processing relative the plugin slug directory and use the same variable to set my zip file base name:

    compress: {
        main: {
            options: {
                mode: 'zip',
                archive: function() {
                    return '../public/' + currentPlugin.slug + '.zip';
                }
            },
            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/<%= currentPlugin.slug %>',
                    src: ['**'],
                }
            ]
        },
    },

Inside my makezip tasks I now set the currentPlugin variable properties as opposed to a generic global variable, which is what I really wanted to do in the first place:

  /**
   * makezip
   *
   * Build the zip file for the plugin, avoiding the assets directory, .git folders, and .neon files.
   */
  grunt.registerTask('makezip', 'Make a zip file for the plugin.', function(slug){
      grunt.log.writeln('Placeholder for makezip.  Slug name is ' + slug);
      currentPlugin.slug = slug;
      grunt.task.run('compress');
  });

Now I am only getting the files for the specified plugin and not the WordPress plugin directory root files.   While I was in there I also changed the src parameter to ** versus *.   ** will grab all files in the current directory and any sub-directories that are part of my plugin.

Excluding Folders and Files

The last step will be to filter out those files I don’t want to include per my “no assets directory”, “no .git”, “no nbproject” and no “apigen.neon” files or folders.   If you follow my WordPress work flow posts you’ll know that this is my development environment and I prefer to work “inline” with the plugin directory and clear out the “cruft” of development files in the production cycle.

Thankfully the Grunt file processor makes excluding files a simple task.   I extend the src property with some “do not include” settings like so:

            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/<%= currentPlugin.slug %>',
                    src: [
                        '**',
                        '!**.neon',
                        '!**.md',
                        '!assets/**',
                        '!nbproject/**'
                    ],
                }
            ]

That source specification will get all files in the main and sub-directories of my plugin EXCEPT for anything ending in .neon, .md or anything in the assets or nbproject sub-directories. By default the file filter will ignore any files starting with a dot, such as my .git, .gitignore, and .gitmodules folders and files.

Sweet!

I’m already liking Grunt WAY more than writing Bash files!

Follow along with other blog posts about the WordPress Workflow and WordPress Development Kit.


Preparing A WordPress Plugin for Grunt Automation

Grunt Getting Started Banner

Thanks to my experience at WordCamp Atlanta (#wcatl), I’ve spent the past couple of days learning about Vagrant and how I can use it to build and distribute new VirtualBox systems to my developer-team-in-training.   I will refine that process to get new members of the development team set up with a CSA Standard environment to bring them up to speed with my process flow with less effort.

Today I am starting with another project inspired by my trip to Atlanta: using Grunt to automate my plugin building experience.   In this article I will go through my setup and initial project automation experience as I learn more about what Grunt can do for me, how to get it set up, and how I can use it to automate at least one of the many steps involved in the final production phase of a WordPress plugin.

My Environment

My WordPress plugin development environment is fully contained within a virtual Linux workstation running on a laptop.   My current setup:

  • CentOS 6.5 with a full GUI desktop
  • Apache 2.x
  • PHP 5.4.23
  • MySQL 5.5.35
  • NetBeans 8.0 RC1
  • SmartGit 3.0.11 (on top of git 1.7.1)
  • WordPress 3.8.1 (soon to be updated to the latest nightly build for 3.9)
  • Firefox
  • Selenium IDE

My Process

The outline of a premium add-on pack production cycle follows a basic routine:

  • Code Cycle
    • Edit code in NetBeans writing directly to the local wp-content/plugins directory.
    • Commit changes via SmartGit to the current development branch.
    • Push changes with SmartGit to BitBucket when ready.
  • Test Cycle
    • sudo mysql < wpreset.sql to reset the WordPress database (blasts all tables)
    • start Firefox and open Selenium IDE
    • run the New WP Install Selenium IDE script
    • run some of the base Store Locator Plus data test scripts
    • for add-on packs run the add-on pack specific test scripts
    • Edit/Commit/Push/Repeat
  • Production Cycle
    • Validate the readme.txt file.
    • Publish a blog post about the update and new features.
    • Edit the CSA Server Update System with new version/date information.
    • Edit the CSA Version Information page.
    • Update the News and Info “signage”.
    • Package the plugin .zip file.
    • Publish the .zip file to the CSA servers.
    • If it is a WordPress Plugin Directory listing, update the svn repo to publish to WordPress.

As you can imagine, there are a lot of steps in the final production cycle that can be automated.    I will also be exploring phpUnit testing for my plugins to provide deeper testing of the plugins that can be automated to be a virtually hands-off test system, but that is a project for later.  For now, I need to learn Grunt and start replacing my useful but less-flexible bash scripts to simplify the final Production Cycle.

Installing Grunt

One of the first things I learned about Grunt is that it runs on node.js and I need the Node Package Manager (npm) to get it working.   On CentOS this is fairly easy.    I open a terminal, execute sudo, and install npm.    It brings the rest of the Node.JS stuff with it as dependencies.   When that is completed you can install grunt-cli and grunt-init via npm. Apparently I am going to want something called “grunt init templates” as part of the Grunt Scaffolding setup, so I will also use git to clone one of the official “bare bones” templates into my Linux user home directory.

NPM is part of the Extra Packages for Enterprise Linux (epel) repository.   You will need to install this before the install npm command will work.   If you are using a stock CentOS 6.5 release you can go to the following URL and click on the package link, open the download with package installer, and the epel yum repository will be installed and activated:

http://mirrors.severcentral.net/fedora/epel/6/i386/repoview/epel-release.html

$ sudo yum install npm
$ sudo npm install -g grunt-cli
$ sudo npm install -g grunt-init
$ git clone https://github.com/gruntjs/grunt-init-gruntfile.git ~/.grunt-init/gruntfile

With my initial install on CentOS 6.5 there were multiple “unmet dependency” errors followed by a “but will load” message. Running the command grunt seems to be pulling up the grunt CLI. For now I am going to assume those errors are not important for getting my first simplified Grunt tasks running.

Grunt CLI install warnings.

Adding Grunt To My Plugin Project

For years I’ve been using a series of Bash scripts to help manage my distribution.   One of the scripts that I use in every production cycle is a script named makezip.sh.    This script packages up the plugin subdirectory I specify, skipping various files and directories (.git, the assets subdirectory, and a few others), and creates a .zip file in a directory “far away from” the running WordPress install.  I can opt to send copies to the live server when they are ready for publication or keep them local for manual distribution and/or testing.   I bring this up because it impacts my first Grunt setup on my Enhanced Results premium add-on for Store Locator Plus.

I already use the assets sub-directory within the wp-content/<plugin> directory to store all of my development and production scripts and related assets.    As such I already have a place where I should be able to store my Grunt configuration and script files without impacting the plugin distribution.

[box type=”note” style=”rounded”]My Dev Environment: The ./assets directory under the current plugin directory is NOT distributed during production.[/box]

To get started I go to my plugin sub-directory on my development system and create a grunt folder and the starting assets via the grunt-init template loaded with git clone as noted above.  After getting the template installed I run npm init to set some package defaults and npm install to fetch the "helpers" for Grunt.  The helpers are node modules, aka Grunt plugins, that will be referenced by the Gruntfile.js execution.

cd ./wp-content/slp-enhanced-results
mkdir assets
cd assets
echo '<?php // Silence is golden.' > index.php
mkdir grunt
cd grunt
echo '<?php // Silence is golden.' > index.php
# get a basic package.json and Gruntfile.js in place
grunt-init gruntfile
# set some defaults in package.json
npm init
# installs the modules specified in the package.json
npm install

[box type=”note” style=”rounded”]What are those echo commands? They create an index.php file to prevent browsing of the directories from a web interface. They are there as an extra safety measure in case the assets directory gets published.[/box]

With the gruntfile template I tell it that I am not using the DOM but will want to concatenate and minify files and that I will be using a package.json file at some point.

Grunt Gruntfile Template Setup
Grunt gruntfile template setup for my add-on pack.

Running this command puts the Gruntfile.js and package.json files in my ./assets/grunt folder and gives me access to the basic scripting tools necessary to start a grunt project.

I think I’m ready for some automation!

Gruntfile Defaults

Earlier I ran the grunt-init gruntfile step to set up a default starter environment for Grunt.   Time to dig into the details of the two installed files.   Some basic reading tells me that the package.json file tells Grunt which "helpers" are to be available to this project by default, including their version numbers:

Grunt “Helpers”

Helpers are officially termed “plugins” in the Grunt world.   I call them helpers at this stage to remind me that they help perform tasks within my project but I’ll still need to guide them as to what to do.

The default “helpers” in package.json:

{
  "engines": {
    "node": ">= 0.10.0"
  },
  "devDependencies": {
    "grunt": "~0.4.2",
    "grunt-contrib-jshint": "~0.7.2",
    "grunt-contrib-watch": "~0.5.3",
    "grunt-contrib-nodeunit": "~0.2.2",
    "grunt-contrib-concat": "~0.3.0",
    "grunt-contrib-uglify": "~0.2.7"
  }
}

What are these helpers? The first couple of entries are obvious. The base JavaScript engine is node and the first “helper” is grunt. What are the rest?   They are all plugins from the grunt-contrib library which gives us some hints:

  • grunt-contrib-jshint

    Validate files with JSHint.   JSHint looks for issues in the syntax of JavaScript files.  With Grunt this happens BEFORE they are published if you keep this as a default task.

  • grunt-contrib-watch

    Run tasks whenever watched files change.  This watches files in your projects.  If a file changes, do something.

  • grunt-contrib-nodeunit

    Run Nodeunit unit tests. Allows node unit tests to be added to your project and run during a build cycle with Grunt.

  • grunt-contrib-concat

    Concatenate files.  Grab a list of files and concatenate them.

  • grunt-contrib-uglify

    Minify files with UglifyJS.  Create minimized JavaScript files from your source files.  Speeds up page load times by cutting out all of the non-executable parts of a JavaScript file including white space.

Grunt Commands and Execution

The other file created by the default template I've used is the Gruntfile, stored as Gruntfile.js.   That is a pretty good hint that it is a JavaScript file.   Here is what it looks like:

/*global module:false*/
module.exports = function(grunt) {

  // Project configuration.
  grunt.initConfig({
    // Metadata.
    pkg: grunt.file.readJSON('package.json'),
    banner: '/*! <%= pkg.title || pkg.name %> - v<%= pkg.version %> - ' +
      '<%= grunt.template.today("yyyy-mm-dd") %>\n' +
      '<%= pkg.homepage ? "* " + pkg.homepage + "\\n" : "" %>' +
      '* Copyright (c) <%= grunt.template.today("yyyy") %> <%= pkg.author.name %>;' +
      ' Licensed <%= _.pluck(pkg.licenses, "type").join(", ") %> */\n',
    // Task configuration.
    concat: {
      options: {
        banner: '<%= banner %>',
        stripBanners: true
      },
      dist: {
        src: ['lib/<%= pkg.name %>.js'],
        dest: 'dist/<%= pkg.name %>.js'
      }
    },
    uglify: {
      options: {
        banner: '<%= banner %>'
      },
      dist: {
        src: '<%= concat.dist.dest %>',
        dest: 'dist/<%= pkg.name %>.min.js'
      }
    },
    jshint: {
      options: {
        curly: true,
        eqeqeq: true,
        immed: true,
        latedef: true,
        newcap: true,
        noarg: true,
        sub: true,
        undef: true,
        unused: true,
        boss: true,
        eqnull: true,
        globals: {}
      },
      gruntfile: {
        src: 'Gruntfile.js'
      },
      lib_test: {
        src: ['lib/**/*.js', 'test/**/*.js']
      }
    },
    nodeunit: {
      files: ['test/**/*_test.js']
    },
    watch: {
      gruntfile: {
        files: '<%= jshint.gruntfile.src %>',
        tasks: ['jshint:gruntfile']
      },
      lib_test: {
        files: '<%= jshint.lib_test.src %>',
        tasks: ['jshint:lib_test', 'nodeunit']
      }
    }
  });

  // These plugins provide necessary tasks.
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-nodeunit');
  grunt.loadNpmTasks('grunt-contrib-jshint');
  grunt.loadNpmTasks('grunt-contrib-watch');

  // Default task.
  grunt.registerTask('default', ['jshint', 'nodeunit', 'concat', 'uglify']);

};

After sitting in on the Grunt session at WordCamp I know a couple of things about this file. The initConfig section sets the rules for the various “helpers” and way down near the bottom is a Default task. This is what does all of the work when I run Grunt in my project.

What is this going to do?

The default I have now is going to run jshint, which will scan the JavaScript files listed in the config for syntax issues and other problems like unused variables (I can tell that by looking at the jshint section higher up in the code). It will then run any nodeunit tests by looking in test/**/ for files ending in _test.js and executing them (I assume). It will concat the lib/<pkg.name>.js source and store the result in dist/<pkg.name>.js.   Finally it will uglify… minify… the file the concat task wrote to dist/ and save it as dist/<pkg.name>.min.js.

Tweaking Gruntfile For Me

Looks like some decent defaults, but for my project I need to change some things.

I won't run Node unit testing on this project.  In the Gruntfile I remove nodeunit from the default task list and from the config section above.   I also remove the grunt-contrib-nodeunit entry from package.json.

As noted above, the Grunt project directory on my setup is under the assets subdirectory for this plugin.   All of my main project files are up a couple of levels in the plugin parent directory.   Since I want to distribute both the original JS files AND the minified version, I need to change some paths.   I am going to concat and uglify the scripts in the main plugin distribution directories and output the minified versions there.   I need to change any paths in the upper “config part” of the Gruntfile.js:

Update the concat section:

    // Task configuration.
    concat: {
      options: {
        banner: '<%= banner %>',
        stripBanners: true
      },
      dist: {
        src: ['../../js/<%= pkg.name %>.js'],
        dest: '../../js/<%= pkg.name %>.concat.js'
      }
    },

And the JSHint section:

    jshint: {
      options: {
        curly: true,
        eqeqeq: true,
        immed: true,
        latedef: true,
        newcap: true,
        noarg: true,
        sub: true,
        undef: true,
        unused: true,
        boss: true,
        eqnull: true,
        globals: {}
      },
      gruntfile: {
        src: 'Gruntfile.js'
      },
      lib_test: {
        src: ['../../js/*.js', '../../js/test/**/*.js']
      }
    },

That will ensure that all JavaScript stuff goes in my standard ./js subdirectory in my plugin. Yes, the users will get those files, making for a larger zip file download and more disk space on their server. Disk space is cheap and download speeds are decent in most places. Not to mention zip is pretty darn good at compressing JavaScript files. When my plugin executes it will load the concatenated, minified file, which gives the full benefits of execution speed and a reduced RAM footprint on the server and the user's browser. This keeps the original source available to my user base so they can read the code and hack functionality if they find the need without wading through obfuscated, minified, concatenated JavaScript.

I’ve also learned that I need to add more details to the package.json file in order to get the default rules and tools to work.  This includes adding the name, version, and author variables to package.json.   If you do not define all 3, INCLUDING AUTHOR, you will get the following error:

Running "concat:dist" (concat) task
Warning: An error occurred while processing a template (Cannot read property 'name' of undefined). Use --force to continue.

Adding the name, version, and author elements to the default package.json results in:

{
  "name": "slp-enhanced-results",
  "version": "4.1.01",
  "author": "csa",
  "engines": {
    "node": ">= 0.10.0"
  },
  "devDependencies": {
    "grunt": "~0.4.2",
    "grunt-contrib-jshint": "~0.7.2",
    "grunt-contrib-watch": "~0.5.3",
    "grunt-contrib-concat": "~0.3.0",
    "grunt-contrib-uglify": "~0.2.7"
  }
}

Now I can run the grunt command, which will execute jshint, concat, and uglify on all my JavaScript files. Currently the Grunt configuration outputs some headers so I know it is thinking about doing something, but I don't have anything interesting to process yet. I will soon, though. That will be content for the next article about automating my WordPress workflow.

First Grunt Run - No Errors
First Grunt Run without errors.
Posted on

Creating A CentOS GUI Vagrant Base Box

CentOS 6.5 Vagrant Login Banner

While playing with PuPHPet and Vagrant I realized my needs are specific enough to warrant building my own Vagrant Base Box.    My process is outlined below.

Setup VirtualBox Hardware

Start VirtualBox and build a new guest "hardware" profile (a scripted VBoxManage equivalent is sketched after the screenshot below):

  • Base Memory: 2048MB
  • Processors: 2
  • Boot Order: CD/DVD , Hard Disk
  • Acceleration: VT-x/AMD-V , Nested Paging , PAE/NX
  • Display: 32MB Video Memory , 3D Acceleration
  • Network: Intel PRO/1000 MT Desktop (NAT)
  • Drive: SATA with 20GB pre-allocated fixed disk
  • CD/DVD : IDE Secondary Master Empty
  • No USB, Audio, or Shared Folders
VirtualBox CentOS 6.5 GUI Base Box
VirtualBox CentOS 6.5 GUI Base Box
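
If you would rather script this hardware profile than click through the VirtualBox GUI, something along these lines should produce a similar guest.  Treat it as a sketch; the machine name, disk filename, and OS type are assumptions:

# sketch only -- run on the host; names and paths are assumptions
VM="CentOS6.5 GUI Base Box"
VBoxManage createvm --name "$VM" --ostype RedHat_64 --register
VBoxManage modifyvm "$VM" --memory 2048 --cpus 2 --pae on \
    --vram 32 --accelerate3d on \
    --nic1 nat --nictype1 82540EM \
    --boot1 dvd --boot2 disk --audio none --usb off
VBoxManage createhd --filename centos65-gui.vdi --size 20480 --variant Fixed
VBoxManage storagectl "$VM" --name "SATA" --add sata
VBoxManage storageattach "$VM" --storagectl "SATA" --port 0 --device 0 \
    --type hdd --medium centos65-gui.vdi
VBoxManage storagectl "$VM" --name "IDE" --add ide
VBoxManage storageattach "$VM" --storagectl "IDE" --port 1 --device 0 \
    --type dvddrive --medium emptydrive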

Base Box “Unbest” Practice

These base settings do not fall within the Vagrant base box best practices; however, I need something a bit different than the typical Vagrant box configuration, which is why I am building my own.   I build my boxes with a full GUI, which lets me spin up the virtual environment, log in to the GUI, and have my entire development environment in a self-contained, portable setting.    There are "lightweight" ways to accomplish this, but I do have my reasons for building out my WordPress development environment this way, which have been outlined in previous posts.

Adding the Operating System

Now that I have the base box setup it is time to layer on the CentOS 6.5 operating system.   I set up my box for the English language with a time zone of New York (United States EST, UTC-5), no kernel dump abilities, and the full drive allocated to the operating system.     It is built as a "Desktop" server, which gives me the full GUI login and makes it easier to set up my GUI dev environment further on down the road.  It does add some GUI apps I don't need very often, but it is nice to have things like a simple GUI text editor and GUI system management tools for the rare cases when I want them and am too lazy to jump out to my host box to do the work.

Per Vagrant standards the box profile is set up with the root password of "vagrant" and a base user for daily use with a username and password also set to "vagrant".

After a couple of reboots the system is ready for a GUI login, but not quite ready for full production.

CentOS 6.5 Login Screen
CentOS 6.5 Login Screen

Adding VirtualBox Guest Additions

One of the first things to do with a VirtualBox install running a GUI is to get VirtualBox Guest Additions installed.  It helps the guest communicate with the host in a more efficient manner which greatly improves the display and the mouse tracking.  Without it the mouse lag in the guest is horrid and is likely responsible for at least 300 of the 3,000 missing hair follicles on my big bald head.

While this SHOULD be a simple operation, the CentOS desktop installation makes it a multi-step process.   Selecting "Insert Guest Additions CD" from the VirtualBox menu after starting up the new box will mount the disk.   It will prompt to autorun the disk and then ask for the root user credentials.    The shell script starts running through the Guest Additions setup but it always fails while building the main Guest Additions module.     The reason is that kernel build kits are needed and they are not installed by default.    I will outline the typical user process here as a point of reference, though most often the first commands I run to fix the issue are those listed at the end of this section.  I've done this enough times to know what happens and don't usually execute the autorun until AFTER I set up the kernel build kit.  You may want to do the same.

Here is what the output looks like after a default CentOS desktop install followed by an autorun of the Guest Additions CD:

Guest Additions Fail on CentOS
This is what happens when you don’t have Kernel build tools setup and try to run Guest Additions on VirtualBox.

[box type=”info” style=”rounded”]Mouse tracking driving you crazy? Toggle to a command line screen on any Linux box with CTRL-ALT-F2. Toggle back to the GUI with CTRL-ALT-F1.[/box]

With the mouse tracking driving me nuts I toggle over to the text console with ctrl-alt-F2 and login as root there.   You can learn what broke the Guest Additions install by going to the log file:

more /var/log/vboxadd-install.log

The typical CentOS desktop build fails the Guest Additions install with this log:

/tmp/vbox.0/Makefile.include.header:97: *** Error: unable to find the sources of your current Linux kernel. Specify KERN_DIR=<directory> and run Make again.  Stop.
Creating user for the Guest Additions.
Creating udev rule for the Guest Additions kernel module.

With Guest Additions disabled and the VirtualBox guest not fully configured it is time to do some basic maintenance and get the kernel build environment available for Guest Additions.  Since I am logged in as root via the console I can start by getting yum updated; however, the network connection is not up by default after the desktop install, so that comes first.    The steps for getting the kernel dev tools in place:

Turn on the network interface eth0 (zero not oh) running:

ifup eth0

Make sure all of the installed software is updated to the latest revision:

yum update

Install the Linux kernel development files which are needed for the Guest Additions installation:

yum install kernel-devel

Install the development tool kit including compilers and other items needed for Guest Additions to hook into the kernel:

yum groupinstall "Development Tools"

Once you have the updates installed reboot the system with a shutdown -r now command while logged in as root.

The Guest Additions CD can now be mounted and autorun without error.

After running Guest Additions, reboot the server.

Turn On The Network At Boot

Now that the GUI is running and the mouse is tracking I can log in as the vagrant user and turn on the network connections.   Login, go  to System / Preferences / Network Connections on the main menu.    Check off “Connect Automatically” on the System eth0 connection.

Now the network will be enabled on boot.   That’s useful.

CentOS 6.5 Turn On Network At Boot
CentOS 6.5 turning on the network at boot.
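
The same change can be made from the console without touching the GUI; a one-liner sketch, assuming the stock eth0 configuration file:

# console equivalent, assuming the default ifcfg-eth0 file is in place
sudo sed -i 's/^ONBOOT=.*/ONBOOT=yes/' /etc/sysconfig/network-scripts/ifcfg-eth0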

Provide SSH Insecure Keypair To Vagrant

Best practice for Vagrant base boxes is to add the insecure keypair to the vagrant user.   While logged in as vagrant go to Applications / System Tools / Terminal to get to the command line.   Go to the .ssh subdirectory and create the authorized_keys file by copying the public key from the Vagrant keypair repository into the authorized_keys file.

I use vim and copy the keypair content and paste it into the file.  You can use cat or other tools as well to get the content into the file.  Make sure not to introduce new whitespace in the middle of the key or it will not work.

Change the permissions of the authorized_keys file using chmod; the permission settings are very important for this file:

chmod 0600 authorized_keys 
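
If you would rather script the key setup than paste it by hand, a sketch along these lines should work as the vagrant user; the URL is an assumption based on where the Vagrant project publishes its insecure public key, so verify it against the keypair repository before trusting it:

# run as the vagrant user; the key URL is an assumed location of the published insecure key
mkdir -p ~/.ssh && chmod 0700 ~/.ssh
curl -k -L -o ~/.ssh/authorized_keys \
    https://raw.githubusercontent.com/hashicorp/vagrant/master/keys/vagrant.pub
chmod 0600 ~/.ssh/authorized_keys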

Give Unrestricted Super Powers To Vagrant

Most users expect the vagrant login to have unrestricted access to all system commands. This is handled via the sudo application. CentOS restricts access by default and requires some updates to get it working per Vagrant best practices. Log back in to the command line console as root and edit the sudo file.

visudo

This brings up the vim editor with the sudo config file. Find the requiretty line and comment it out by adding a # before it. Then add the following line to the bottom of the file:

vagrant ALL=(ALL) NOPASSWD: ALL

Logout of the vagrant and root sessions and log back in as vagrant from the GUI. You should be able to open a terminal and run any sudo commands without a password prompt. You should also be able to run sudo commands “remotely” via the ssh connection to the system.
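
A quick way to confirm the passwordless sudo is in place without waiting for a prompt:

# -n makes sudo fail instead of prompting, so this only succeeds when NOPASSWD is active
sudo -n true && echo "passwordless sudo is working"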

Make SSH Faster When DNS Is Not Available

If the host and/or virtual box cannot connect to the Internet the SSH access into the Vagrant virtual box will be slow.   Editing the sshd_config file and turning off DNS lookups will fix that.   Now that you have “vagrant super powers” you can do this by logging in as the vagrant user and opening the terminal:

sudo vim /etc/ssh/sshd_config

Add this line to the bottom of the file:

UseDNS no
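
The sshd daemon only reads that file at startup, so restart the service (or reboot) for the change to take effect:

sudo service sshd restart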

Host To Guest SSH Access

Connecting from the host system to the guest system WITHOUT using the graphical login or console takes a couple of extra steps. To test the SSH connection I go back to my favorite SSH tool, PuTTY.     Before testing the connection the port forwarding needs to be setup on VirtualBox Manager.

  • Go to the new system listed on the VirtualBox Manager.
  • Right-click and select Settings.
  • Select Network.
  • Click the Port Forwarding button.
  • Add the following rule:
    • Name: SSH Local To Guest
    • Protocol: TCP
    • Host IP: 127.0.0.1
    • Host Port: 4567
    • Guest IP: leave this blank
    • Guest Port: 22

Save the settings.   Open PuTTY and connect to hostname 127.0.0.1 and change the port to be 4567.   You should get a login prompt.   Login with user vagrant.

VirtualBox SSH Port Forwarding
VirtualBox SSH port forwarding for Vagrant.
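
The same forwarding rule can also be added from the host command line with the box powered off; a sketch using the machine name from this article:

# rule format is name,protocol,host ip,host port,guest ip,guest port
VBoxManage modifyvm "CentOS6.5 GUI Base Box" \
    --natpf1 "SSH Local To Guest,tcp,127.0.0.1,4567,,22"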

The issue with logging in with the vagrant private key file is that PuTTY only supports the proprietary PuTTY Private Key format.    You can download puttygen to convert the Vagrant private key file to the PuTTY Private Key file format (click to download the converted OpenSSH key in PPK format).

To use SSH keys in PuTTY, start a new session, enter 127.0.0.1 as the host and 4567 as the port, then set the PuTTY Private Key:

  • Click on “connection / SSH” in the left side menu to expand that selection.
  • Click on “Auth”.
  • Under Authentication parameters browse to your saved PPK file in the “Private key file for authentication” box.
Setting PuTTY Vagrant PPK
Setting PuTTY Vagrant PPK files.

Now you can connect with PuTTY and login by simply supplying a username.   This tells us that the remote vagrant command line should be able to execute all of the scripted setup commands without any issues.

Building A Box

Now that the basic system is in place it is time to "build the box".   Vagrant has a command for doing this, and if you've read my previous articles on setting up Vagrant you will know that I have a Windows command line shortcut that runs in my WP Development Kit folder.   With Vagrant already installed, building a box is a one-line command.   I only need my machine name, which I've shortened to "CentOS6.5 GUI Base Box".  Start up the Windows command line and run this:

vagrant package --base "CentOS6.5 GUI Base Box"

It will run for a while and eventually create a packaged Vagrant box ready for distribution.    By default the file will be named package.box.    I’ve renamed mine to centos6_5-gui-base.box for distribution purposes.   You can find it on my Vagrant Cloud account.

You can learn more about the box-building process via the Vagrant Creating A Base Box page.

Launching The Box

To launch the new box hosted on Vagrant Cloud I go to my local folder and execute these commands:

Download the image (stored on my Google Drive account) using Vagrant Cloud as a proxy:

vagrant box add charlestonsw/centos6.5-gui-base-box 

Create the vagrantfile that assists in the box startup command sequence:

vagrant init charlestonsw/centos6.5-gui-base-box

Start the box on VirtualBox:

vagrant up

By default, Vagrant starts boxes in headless mode, meaning no active console.   I want the GUI login so I shut down the box and find the vagrantfile to add the GUI startup line.    The command is already in the file and only needs a few lines to be uncommented to allow a GUI startup with a console.    Edit the vagrantfile and look for these lines:

config.vm.provider "virtualbox" do |v|
v.gui = true
end

There are a few other comments in the default vagrantfile; you can leave the memory customization example commented out.  You will end up with a vagrantfile section that looks like this:


  # Provider-specific configuration so you can fine-tune various
  # backing providers for Vagrant. These expose provider-specific options.
  # Example for VirtualBox:
  #
  config.vm.provider "virtualbox" do |vb|
    # Don't boot with headless mode
    vb.gui = true

    # Use VBoxManage to customize the VM. For example to change memory:
    # vb.customize ["modifyvm", :id, "--memory", "1024"]
  end

Save the file and restart the box with the vagrant up command.

That’s it… a new Vagrant box.   Now on to the system tweaks to get my WP Dev Kit setup.

Posted on

Automated Virtual Box Creation V1.0 Notes

PuPHPet Banner

If you read my previous article, WordPress Workflow : Automated Virtual Box Creation, you have an idea of what I am trying to accomplish with improving my WordPress development work flow.    The short version: I want to be able to create a fresh install of a virtual machine that has my entire development system intact with minimal input on my part.    The idea is to run a few commands, wait for the installs and updates, and be coding on a "clean" machine shortly after.    Once I get my own work flow updated I will also be able to share my scripts and tools via a git repository with the remote developers that are now working on Store Locator Plus add-on packs, and hopefully simplify their development efforts, or at least get all of us on a similar baseline of tools.

Here are my notes from the first virtual development box efforts via PuPHPet, Vagrant, and Puppet.    This build was done with recent “off-the-shelf” versions of each of these tools and using a base configuration with a handful of options from the PuPHPet site.

Headless Configuration

The VirtualBox machine appears to be created as a “headless” box, meaning no monitor or other display device is active.   I will need to tweak that as I work “on the box” with GUI development tools.    I know that I can install all of my development tools on my host system and read/write from a shared directory to get all of my work onto the virtual machine, but that is not my methodology.    Having worked with a team of developers I know all too well that eventually the host hardware will die.   A laptop will need to be sent off for repair.   Guess what happens?   You lose half-a-day, or more, setting up a new host with a whole new install of development tools.

The better solution, for my work flow, is to keep as much of the development environment “self contained” within the virtual box as possible.   This way when I backup my virtual disk image I get EVERYTHING I need in an all-in-one restore point.   I can also replicate and share my EXACT environment to any location in the world and be fully  “up and running” in the time it takes to pull down a 20GB install file.  In today’s world of super-fast Internet that is less of an issue than individually pulling down and installing a half-dozen working tools and hoping they are all configured properly.

What does this all mean?    I need to figure out how to get the PuPHPet base configuration tweaked so I can start up right from the VirtualBox console with a full Linux console available.  I’ll likely need to update Puppet as well to make sure it pulls down the Desktop package on CentOS.

I wonder if I can submit a build profile via a git pull request to PuPHPet.

Out-Of-Box Video Memory Too Low

The first hurdle with configuring a “login box” with monitor support will be adjusting the video RAM.   My laptop has 4GB of dedicated video RAM on a Quadro K3100M GPU.   It can handle a few virtual monitors and has PLENTY of room for more video RAM.   Tweaking the default video configuration is in order.

Since Vagrant "spins up" the box when running the vagrant up command, the initial fix starts by sending an ACPI shutdown request to the system.     Testing the video RAM concept is easy.   Get to the VirtualBox GUI, right-click the box and select Settings.   Adjust the video RAM to 32MB, turn on 3D acceleration (it makes the GUI desktop happy), and restart.

Looks like I can now get direct console login.  Nice!

PuPHPet Virtual Box with Active Console
PuPHPet Virtual Box with Active Console
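
The same adjustment can be scripted with VBoxManage once the box is shut down; "puphpet-box" below is a placeholder for whatever name PuPHPet gave the machine:

# scripted equivalent of the GUI tweak; replace the placeholder machine name
VBoxManage modifyvm "puphpet-box" --vram 32 --accelerate3d on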

Access Credentials

The second issue, which I realized after seeing the login prompt, is that I have NO IDEA what the login credentials are for the system.   This doesn’t matter much when you read/write the shared folders on your host to update the server and only “surf to” the box on port 8080 or SSH in with a pre-shared key, but for console login a username and password are kind of important.   And I have no clue what the default is configured as.  Time for some research.   First stop?  The vagrantfile that built the beast.

Buried within that vagrantfile, which looks just like Ruby syntax (I'm fairly certain it is Ruby code), is a user name "vagrant".    My first guess?  Username: vagrant, password: vagrant.     Looks like that worked just fine.    Now I have a console login that "gets me around", but it is not an elevated-permissions user such as root.   However, a simple sudo su - resolves that issue, granting me full "keys to the kingdom".

[box type=”info” size=”large” style=”rounded”]Vagrant Boxes Credentials are username vagrant, password vagrant[/box]

A good start.   Now to wreak some havoc to see what is on this box and where so I can start crafting some Puppet rule changes.   Before I get started I want to get a GUI desktop on here.

GUI Desktop

To get a GUI desktop on CentOS you typically run the yum package installer with yum groupinstall Desktop.    A quick session under sudo su - and executing that command gets yum going, pulling down the full X11/Gnome desktop environment.

A quick reboot with shutdown -r now from the root command line should bring up the desktop this time around… but clearly I missed a step as I still have a console login.  Most likely a missing startx command or something similar in the boot sequence of init.d.

A basic startx & from the command line after logging back in as vagrant/vagrant and my GUI desktop is in place, so clearly I need to turn on the GUI login/boot loader.
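
On a stock CentOS 6 box the missing piece is most likely the default runlevel; a sketch of the change I would try, switching boot from runlevel 3 (console) to runlevel 5 (graphical):

# assumes the stock CentOS 6 /etc/inittab with a runlevel 3 default
sudo sed -i 's/^id:3:initdefault:/id:5:initdefault:/' /etc/inittab
sudo shutdown -r now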

Tweaking PuPHPet Box Parameters

Now that I know what needs to change I need to go and create that environment via the PuPHPet/Vagrant/Puppet files so I can skip the manual tweaking process.   After some digging I found the config.yaml file.    When you use PuPHPet this file will be put in the .zip download you receive at the end of the PuPHPet process.   It is in the <boxid>/puphpet/ directory.

PuPHPet config.yaml
PuPHPet config.yaml

While some of the box parameters can be adjusted in these files, it appears much of the hardware cannot be manipulated.  There is a site called “Vagrant Cloud” that has multiple boxes that can be configured.   To switch boxes you can edit the config.yaml file and replace the box_url line to point to one of the other variants that may be closer to your configuration.  Since I don’t see one that is close to my needs it looks like I will have to build my own box profile to be hosted in the cloud.   That is content for another article.

 

Posted on

WordPress Workflow : Automated Virtual Box Creation

PuPHPet Vagrant Puppet Banner

I am into my first full day back after WordCamp Atlanta (#wcatl) and have caught up on most of my inbox, Twitter, and Facebook communications.   As I head into a new week of WordPress plugin production I decided now is as good a time as any to update my work flow.

I learned a lot of new things at WordCamp and if there is one thing I’ve learned from past experience it is DO NOT WAIT.   I find the longer I take to start implementing an idea the less chance I have of executing.

My first WordCamp Atlanta 2014 work flow improvement starts right at the base level.   Setting up a clean local development box.   I had started this process last week by manually configuring a baseline CentOS box and was about to setup MySQL, PHP, and all the other goodies by hand.  That was before I learned more about exactly what Vagrant can do.   I had heard of Vagrant but did not fully internalize how it can help me.  Not until this past weekend, that is.

My Work Environment

Before I outline my experience with the process I will share my plugin development work environment.

  • Host System: Windows 8.1 64-bit on an HP Zbook laptop with 16GB of RAM with a 600GB SATA drive
  • Guest System: CentOS 6.5 (latest build) with 8GB RAM on an Oracle VirtualBox virtual machine
    • Linux Kernel 2.6.32-431
    • PHP v5.4.23
    • MySQL v 14.14 dist 5.5.35
  • Dev Tool Kit: NetBeans, SmartGit, Apigen and phpDoc, MySQL command line, vim
HP Zbook Windows 411
My Development System laptop config.

While that is my TYPICAL development environment, every-so-often I swap something out such as the MySQL version or PHP version and it is a HUGE PAIN.    This is where Vagrant should help.  I can spin up different virtual boxes such as a single-monitor versus three-monitor configuration when I am on the road or a box with a different version of PHP.     At least that is the theory anyway.   For now I want to focus on getting a “clean” CentOS 6.5 build with my core applications running so I can get back to releasing the Store Locator Plus Enhanced Results add-on pack this week.

Getting Started With Vagrant

The Rockin' Local Development With Vagrant talk that Russel Fair gave on Saturday had me a bit worried as he was clearly on an OS/X host and the examples looked great from a command line standpoint.  Being a Linux geek I love the command line, but I am not about to run virtual development boxes inside a VirtualBox guest.   Seems like a Pandora's box to me… or at least a Russian doll that will surely slow down performance.   Instead I want to make sure I have Vagrant running on my Windows 8.1 bare metal host.    That is very much against my "full dev environment in a self-contained and portable virtual environment" standard, but one "helper tool" with configurations backed up to my remote Bitbucket repository shouldn't be too bad, as long as I don't make it a habit to put dev workflow tools on my host box. Yes, Vagrant does have a Windows installer and I'm fairly certain I won't need to be running command-line windows to make stuff work.   If I'm running Windows I expect native apps to be fully configurable via the GUI.  Worst case I may need to open a text editor to tweak some files, but no command line please.

Here is the process for a Windows 8.1 install.

  • Download Vagrant.
  • Install needs to be run as admin and requires a system reboot.
  • Ok… it did something… but what?   No icons on the desktop or task bar or … well… anywhere that I can find!

Well… sadly it turns out that Vagrant appears to be a command line only port of the Linux/OSX variants.    No desktop icons, no GUI interface.   I get it.  Doing that is the fast and easy process, but to engage people on the Microsoft desktop you really do need a GUI.    Yes, I’m geek enough to do this and figure it out.   I can also run git command line with no problem but I am FAR more efficient with things like the SmartGit GUI interface.

Maybe I’m not a real geek, but I don’t think using command line and keyboard interaction as the ONLY method for interacting with a computer makes you a real techie.    There is a reason I use a graphical IDE instead of vim these days.    I can do a majority of my work with vim, but it is FAR more efficient to use the GUI elements of my code editor.

Note to Vagrant: if you are doing a windows port at least drop a shortcut icon on the desktop and/or task bar and setup a Windows installer.   Phase 2: consider building a GUI interface on top of the command line system.

It looks like Vagrant is a lower-level command line tool.   It will definitely still have its place, but much like git, this is a tool onto which other "helpers" need to be added to make my workflow truly efficient.  Time to see what other tools are out there.

Kinda GUI Vagrant : PuPHPet

Luckily some other code geeks seem to like the idea of a GUI configuration system and guess what?   Someone created a tool called PuPHPet (which I also saw referenced at WordCamp so it must be cool) and even wrote an article about Vagrant and Puppet.   Puppet is an "add-on", called a provisioner, used to set up the guest software environment.

PuPHPet is an online form-based system that builds the text-file configuration scripts needed by Vagrant to build and configure your VirtualBox (or VMware) servers.   It is fairly solid for building a WordPress development environment, but it does mean reverting back to CentOS 6.4 as CentOS 6.5 build scripts are not online.     I am sure I could tweak that line of the config files and fix that, but doing so takes me one step away from the "point and click" operation I am looking for.

Either way, PuPHPet, is very cool and definitely worth playing with if you are going to be doing any WordPress-centric Vagrant work.

PuPHPet Intro Page
The PuPHPet online configuration tool for creating Vagrant + Puppet config files.

 

Puppet Makes Vagrant and PuPHPet Smarter

Now that I have Vagrant installed and I discovered PuPHPet I feel like I am getting closer to a “spin me up a new virtual dev box, destroy-as-desired, repeat” configuration.  The first part of my workflow improvement process.   BUT…. I need one more thing to take care of it seems… get Puppet installed.   I managed to wade through the documentation (and a few videos) to find the Windows installers.

Based on what is coming up in the install window it looks like the installer will roll out some Apache libs, ruby, and the windows kits that help ruby run on a windows box.

Puppet Install Licenses
The Puppet installer on Windows.

Again, much like Vagrant, Puppet completes the installation with little hint of what it has done.    Puppet is another command line utility that runs at a lower-level to configure the server environments.   It will need some of the “special sauce” to facilitate its use.     A little bit of digging has shown that the Puppet files are all installed under the C:\Program Files (x86)\Puppet Labs folder.    On Windows 8.1 the “Start Menu” is MIA, so the documentation about finding shortcuts there won’t help you.    Apparently those shortcuts are links to HTML doc pages and some basic Windows shell scripts (aka Batch Files) so nothing critical appears to have gone missing.

The two files that are referenced most often are the puppet and facter scripts, so we'll want to keep track of those.   I'll create a new folder under My Documents called "WP Development Kit" where I can start dumping things that will help me manage my Windows-hosted virtual development environment for WordPress. While I'm at it I will put some links in there for Vagrant and get my PuPHPet files all into a single reference point.

WP Dev Kit Directory
The start of my WP Dev Kit directory. Makes finding my PuPHPet, Vagrant, and Puppet files easier.

Now to get all these command line programs to do my bidding.

Getting It Up

After a few hours of reading, downloading, installing, reading some more, and chasing my son around the house as the "brain eating dad-zombie", I am ready to try to make it all do something for me.    Apparently I need to use something called a "command line".  On Windows 8.1.

I'm giving in with the hopes that this small foray into the 1980's world of command line system administration will yield great benefits that will soon make me forget that DOS still exists under all these fancy icons and windows.   Off to the "black screen of despair", one of the lesser-known Windows brethren of the "blue screen of death".     Though Windows 8 tries very hard to hide the underpinnings of the operating system, a recent Windows 8 patch, and part of Windows 8.1 since "birth", is the ever-useful Windows-X keyboard shortcut.   If you don't know this one, you should.   Hold down the Windows key and press X.   You will get a Windows pop-up menu that will allow you to select, among many other things, the Command Prompt application.

If you right-click on the “do you really want to go down this rabbit hole” confirmation box that comes up with the Command Prompt (admin) program you will see that it is running C:\Windows\system32\cmd.exe.     This will be useful for creating a shortcut link that will allow me to not only be in command mode but also to be in the “source” directory of my PuPHPet file set.    I’m going to create a shortcut to that application in my new WP Development Kit directory along with some new parameters:

  • Search for cmd.exe and find the one in the Windows\system32 directory.
  • Right-click and drag the file over to my WP Development Kit folder, selecting “create shortcuts here” when I drop it.
  • My shortcut to cmd.exe is put in place, but needs tweaking…
  • Right-click the shortcut and set the “Start in” to my full WP Development Kit folder.

Now I can double-click the command prompt shortcut in my WP Development Kit folder and not need to change directory to a full path or “up and down the directory tree” to get to my configuration environment.

Running Vagrant and Puppet via PuPHPet Scripts
Running Vagrant and Puppet via PuPHPet Scripts

A few key presses later and I've managed to change to my downloaded PuPHPet directory and execute the "vagrant up" command.   Gears started whirring, download counters started ticking, and it appears the PuPHPet/Vagrant/Puppet trio are working together to make something happen.  At the very least it is downloading a bunch of stuff from far away lands and filling up my hard drive.   Hopefully with useful VirtualBox disk images and applications required to get things fired up for my new WordPress dev box.

We’ll see…

Link Summary