Posted on

VVV For WordPress Development

Banner Vagrant Cloud WP Dev Kit Box

Now that WordPress 4.3 has been released I am taking a few days to reconfigure my production environment for WordPress plugin development. I had been running a self-contained GUI development environment: an all-in-one Vagrant-based VirtualBox machine that included WordPress core, full plugin development, and all of my supporting infrastructure including phpStorm, SmartGit, and a series of Grunt scripts. After a number of issues running that setup, I decided to try something new. Losing 20 minutes every few days because the self-contained GUI in VirtualBox could not sync between guest and host was too much. Something in the VirtualBox or CentOS 7 upgrades over the past year broke something fundamental in GUI I/O and I have been unable to track it down. Time for a change.

My change? Learning Varying Vagrant Vagrants. For those who are not familiar with VVV for WordPress development, you may want to check it out here: https://github.com/Varying-Vagrant-Vagrants/VVV.

What is VVV?

VVV is a virtual development environment for WordPress. It spins up a headless (no GUI) VirtualBox machine that contains three separate versions of WordPress (stable, dev, and trunk) as well as a myriad of tools like phpMyAdmin. All of the settings are in place to allow your local system, an OS X desktop in my setup, to interact with the local WordPress install from your preferred web browser.

The upside is there is a lot of community support, articles, and various tools and tricks available to you for doing almost anything you want. A lot of the “cool dev tricks” I never had fully working, like interactive XDebug support in phpStorm, are readily available. It is also super easy to switch between WordPress releases, which is handy if you are sending core patches or need to test on the upcoming major release.

The downside is that you need to set up all of your development tools locally on your desktop. Guess what happens if your computer dies? Yup, another few hours of setting it up again. With my prior custom self-contained virtual environment I only needed to save my VirtualBox machine, usually by creating a Vagrant image, any time I made notable changes to my tool kit; by doing so I could restore it easily to ANY desktop ANYWHERE in the world and have EXACTLY the same environment in no more time than it takes to spin up a VVV-based box.

In short, VVV is a virtual machine stored on your local desktop with several WordPress installs ready-and-waiting behind your browser screen.

My Startup Tricks

I develop a number of WordPress plugins, so having full development tools and my source code are key to productivity.  Here are some things I needed to tweak on the default VVV setup to get going.

Linking Plugin Source

I am primarily developing plugins and I want them on all of the WordPress installs provided by VVV. I can “take over” a VVV server-based directory by mapping a local directory to the destination with a Vagrant Customfile. Go to the base location where you placed your VVV install (you will know you are in the right place because it contains the Vagrantfile for the VVV box) and create a new file named “Customfile”. Vagrant reads this file automatically the next time you run vagrant up or vagrant reload.

Here are my mapping entries to take over the plugin directory on all three WordPress installs that come with VVV:


config.vm.synced_folder "/Users/lancecleveland/Store Locator Plus/plugin_code", "/srv/www/wordpress-default/wp-content/plugins", :owner => "www-data", :mount_options => [ "dmode=775", "fmode=774" ]

config.vm.synced_folder "/Users/lancecleveland/Store Locator Plus/plugin_code", "/srv/www/wordpress-develop/wp-content/plugins", :owner => "www-data", :mount_options => [ "dmode=775", "fmode=774" ]

config.vm.synced_folder "/Users/lancecleveland/Store Locator Plus/plugin_code", "/srv/www/wordpress-trunk/wp-content/plugins", :owner => "www-data", :mount_options => [ "dmode=775", "fmode=774" ]

Configuring XDebug

phpStorm comes with an XDebug listener. This allows you to set breakpoints in your PHP code files, inspect variables in real time, and do a lot of other things that are much more efficient than var_dump, print_r, or die statements littered throughout the code. There are a number of articles and videos on using XDebug with phpStorm. Check it out, it is a great debugging tool. For now, how to enable it with VVV:

Turning on XDebug is easy with VVV.

  • Go to the VVV install directory.
  • Enter the box via SSH: vagrant ssh
  • Turn on XDebug from the SSH command line on the virtual machine: xdebug_on

That’s it; I can now use my local phpStorm install to debug my VVV files.
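The steps above can also be collapsed into a single command run from the host. This is a sketch; it assumes your shell is already in the VVV install directory and the box is running:

```shell
# Run the xdebug_on helper inside the guest without opening an
# interactive shell session.
vagrant ssh -c "xdebug_on"
```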

Here is THE XDebug + VVV + phpStorm video to watch to do this.

Useful Meta

With VVV installed these URLs should work in your browser.

Users and passwords:

  • For the WP installs:  wp / wp
  • For WP Admin Users: admin / password
  • MySQL Root: root / root
    • Default DB Name: wordpress_default
    • Trunk DB Name: wordpress_trunk
    • Develop DB Name: wordpress_develop

Paths:

  • Local directories (relative to Vagrant install): ./www
  • Server-Side directories: /srv/www

Selenium IDE Running Suites of Suites

Selenium Add On Banner

For over a year now I’ve been running over a dozen Selenium IDE test suites every time I update the Store Locator Plus base plugin. It is a manual process that is time consuming, though several orders of magnitude less so than fully manual testing. Today I learned how to be even more efficient with my time, which my forthcoming customer support and QA team will hopefully appreciate when they come on board this summer.

Here is the Stack Overflow summary I posted on my automated “suite of suites” process which took me several days of searching and testing to discover.

I have a few dozen test suites built in Selenium IDE to assist with testing my Store Locator Plus WordPress plugin. Sometimes I need to run a single Selenium test suite. However when I release a new version of the base plugin I want to run a dozen test suites one-after-another.

While not a perfect fit for your use case of creating several “master suites”, I did find a pair of Selenium IDE plugins that allow me to create a single “favorites list of suites” and run all of my favorites back-to-back.

It may be possible to investigate & modify the plugin JavaScript to create several different “favorites lists” that may suit your needs. In the meantime you can get at least one “master list of suites” by combining these Selenium IDE add-ons:

After installing each of these add-ons (technically Mozilla Firefox plugins) you will see a favorites button inside the Selenium IDE interface. Mark your favorite suites and you will have your “list”. You can now select “Favorites / Run All” from the Selenium IDE menu.

You may want to be careful about the sequence in which you mark your favorites. I marked them in the order I wanted them to run. Open test suite #1, favorite, test suite #2 favorite etc. then “run all”. Worked great and shows me the total run count and fail count across all suites (and thus tests) that were executed. The log, sadly, appears to be reset at each suite however.


Selenium IDE Includes AKA “Rollups”

Selenium IDE is used extensively to test the Store Locator Plus family of plugins. As the testing continues to expand, so do the rules being tested. Today a rule needed to be added that checks for the PHP notice “use of undefined constant”. There is an existing set of other PHP warnings and notices, added last month, that appears in 30+ Selenium scripts. They were all copied by hand into each test.

Now I need to add one more rule to that test set.   Copy and paste into 30 files?  There HAS to be a better way.

Turns out there is a better way, but it requires a little JavaScript coding to make it happen. Selenium IDE does not have an “include <file>” option in the base set of commands. Personally I think they need to add it to the base command set, as it would make it far easier for people to write “test subsets” and then include them in every test. The solution is something called a “rollup”. A rollup is a label for a group of commands you want to execute in many places in your test scripts.

My original test suites looked something like this:

Selenium Test for PHP Warnings

In the “old days”, yesterday to be precise, I would copy-and-paste this set of Selenium IDE tests into a LOT of test files.

Today I decided to be smart about it (happens every now and then… must be a full moon or something) and find out how to do an “include” of those commands. The trick is to create a file called something like “rollups.js”. My file is called slp_rollups.js. It is a standard JavaScript file that I place in the same directory as all of my Selenium IDE test scripts (which happen to be nothing more than HTML snippets).

To replace those 4 commands with a rollup I created this slp_rollups.js file:

/**
 * For use in Selenium IDE.
 *
 * You will need to add this to Selenium under Options / Options in the Selenium Menu.
 * Put this under Selenium Core Extensions:
 * ~/selenium-ide-for-slp/sideflow.js , ~/selenium-ide-for-slp/slp_rollups.js
 *
 *
 */
var manager = new RollupManager();


/**
 * check_for_syntax_errors
 *
 * This rollup tests for php syntax errors, warnings, and notices.
 */
manager.addRollupRule({
    name: 'check_for_syntax_errors',
    description: 'Check for PHP syntax errors, notices, warnings.',
    args: [],
    commandMatchers: [],
    getExpandedCommands: function(args) {
        var commands = [];

        commands.push({
            command: 'assertNotText',
            target: '//body',
            value: '*Notice: Undefined*'
        });

        commands.push({
            command: 'assertNotText',
            target: '//body',
            value: '*Notice: Trying to get*'
        });

        commands.push({
            command: 'assertNotText',
            target: '//body',
            value: '*Notice: Use of*'
        });

        commands.push({
            command: 'assertNotText',
            target: '//body',
            value: '*Fatal error:*'
        });

        return commands;
    }
});

To activate this I update Selenium IDE by going to Options/Options and adding this new slp_rollups.js file to the Selenium Core Extensions. Since I also use the sideflow.js file to provide Go To / If and other looping constructs, I add both sideflow.js and slp_rollups.js to my extensions list by separating the file names with a comma.

Selenium IDE Options with sideflow and slp_rollups enabled.

Now I can replace that block of 4 commands with a single command, rollup with a target of check_for_syntax_errors, in ALL 30 scripts. The best part is that the next time I need to add another test for a new warning or error I only edit ONE file, slp_rollups.js, which means less editing, less copy & paste, and fewer commits to the git repository.

Selenium IDE Implementing Rollups

Getting Started With SLPDev Cent 7 Vagrant Box

This article is for the Store Locator Plus development team that is helping maintain the Store Locator Plus plugin family. It is a basic outline of how to get the SLPDev Cent 7 virtual machine, a private distribution given to qualified developers, personalized and ready for use so you can start working on the Store Locator Plus plugins.

Get The Box

The first step is to get the latest SLPDev Cent 7 Vagrant box image from Charleston Software Associates. The image is approximately 2GB and requires a working Vagrant installation on your laptop or desktop computer along with a working install of VirtualBox. You will use the provided image to create a new VirtualBox machine with a self-contained GUI development environment for the Store Locator Plus WordPress plugin.

What Is On The Box

The box is configured according to the CentOS 7 WP Dev Kit specification with a few additions.    An ssh key has been configured to provide easy access to the repositories. The WordPress installation has been completed with a site title “SLPDev” and an admin user login of vagrant / vagrant.   You get to the WordPress site by opening Firefox via Applications/Favorites and surfing to http://localhost/.

All of the current Store Locator Plus plugins are installed via the BitBucket git repositories including:

  • Store Locator Plus
  • Contact Extender
  • Directory Builder
  • Enhanced Map
  • Enhanced Results
  • Enhanced Search
  • Event Location Manager (in development, debaat/CSA)
  • Janitor
  • Location Extender
  • Pro Pack
  • Real Estate Extender (in development, aknight/CSA)
  • Social Media Extender (debaat)
  • Store Pages
  • User Managed Locations (debaat)
  • Widgets

The Selenium IDE test suite is installed in the vagrant home directory as is the WordPress Dev Kit with the Store Locator Plus publication scripts.

Personalize The Box

Before starting development you will want to change several identifiers so your code updates can be attributed to you.    You will need to run SmartGit and enter your BitBucket username and password credentials to access the repositories.    You will also want to configure git to set your username and email as the default commit author.

Git / SmartGit Update

Run the following commands from the Linux command line terminal (Applications / Favorites):


git config --global user.email 'your@email.com'

git config --global user.name 'Your Name'

The next thing you should do for SLP development is open each repository in SmartGit, then fetch and pull (with rebase, not merge, as the default mode) the latest updates for any plugins you are going to work on.
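SmartGit has its own per-repository setting for rebase-not-merge, but you can also make it the default for the plain git command line. A small sketch:

```shell
# Make a plain "git pull" rebase your local commits on top of the
# fetched branch instead of creating a merge commit.
git config --global pull.rebase true

# Confirm the setting.
git config --global pull.rebase   # prints: true
```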


Good git Behavior for the CSA Code Team

Man Branches Banner

I am in the midst of training and evaluating people that are interested in working on the Store Locator Plus code projects.   Some people are veteran coders.  Others are learning coding in a team environment for the first time.    Here are some basic guidelines on using git when working on the CSA codebase that can be useful if you are using git version control for any team project.

Use Descriptive Commit Messages

Your commit messages should be descriptive at least 99% of the time. These are not good commit messages:

(Screenshot: a commit log full of one-word messages.)
When the team or project lead goes to integrate your code with the work everyone else has contributed, they should have some clue what each commit was doing. A single word is NEVER a good commit message. At the very least you should almost always be using a noun and a verb in a commit message. “Debug actions” would be at least a little better than “debug”. Better yet, use a short sentence.

Create Separate Debugging Branches

Since we are on the topic of debug commits: they should rarely be in your “mainline” branch of code. Yes, debugging happens. Yes, debug commits often end up in a series of commits on a code branch, especially when working on a complex piece of code. However, if you start a coding cycle knowing “I’m going to do a lot of debugging to figure out what is going on here” then it is almost always a good idea to start by creating a new “debug_this_whacky_thing” branch and dumping all your “code barf” in there until you figure things out.
When you do, go back to the “real_work” branch and check that out and put the valuable pieces of code from your learned lessons in that branch.
If you manage to stumble across a useful piece of code on your “testing_stuff” branch you can always add it to your “real_work” branch with something called “cherry picking”. That is a git command, and in SmartGit it is simple to execute. Checkout the real_work branch, then select the one or two commits that did something useful from the debugging_code_barf branch and “cherry pick” them.
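The same cherry-pick flow works from the git command line. Here is a throw-away sketch; the repository path, file names, and branch names are all made up for illustration:

```shell
set -e
# Build a scratch repository with a real_work branch and a debug branch.
rm -rf /tmp/cherry_demo && git init -q /tmp/cherry_demo && cd /tmp/cherry_demo
git config user.email dev@example.com
git config user.name "Demo Dev"
echo base > app.txt && git add app.txt && git commit -qm "base work"
git branch real_work

# Dump "code barf" on a debug branch; one commit turns out to be useful.
git checkout -qb debugging_code_barf
echo "useful fix" > fix.txt && git add fix.txt && git commit -qm "add useful fix"
USEFUL=$(git rev-parse HEAD)
echo "noise" > noise.txt && git add noise.txt && git commit -qm "debug noise"

# Bring only the useful commit over to real_work.
git checkout -q real_work
git cherry-pick "$USEFUL"   # fix.txt arrives; noise.txt stays behind
```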

Commit Often

Small frequent commits are better with just about any version control system and ESPECIALLY when using git.   It tends to create fewer code conflicts during a merge.     This does not mean committing every single line of code on a per-commit basis.   However you should commit every time you write code that changed something and are at a “stopping point”.    Typically this is at the point of “ok I am going to test this now and see if it does what I expected”.    Often it is best to do a “dry run” and make sure there are no blatant errors such as syntax errors before committing.     Try to commit unbroken, if not functional, code.    In other words it should not crash whatever you are working on with an immediate and obvious “your forgot a curly bracket” error.

Use Branches

Like the debugging branch noted above, any time you start a new concept, path,  model, design, or feature start a new branch.   Try to work from a root point, such as the last major release of a product or the last tested-to-be-working version of the software.    Unless your new concept requires the code of a prior effort going back to the root “last working base copy we published” is a good starting point.    The project or team lead will merge code into a prerelease or production (master) branch or an integration branch to create a new product release version.
If you have done work on several feature branches that are not dependent on each other but work better together, create your own integration branch.   “my_super_pack” branch can be a merge-commit of your “feature_a”, “super_awesome_feature”, and “feature_b” branches.
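A sketch of that personal integration branch from the command line, with all repository paths and branch names invented for illustration:

```shell
set -e
# Scratch repository with two independent feature branches off one root.
rm -rf /tmp/pack_demo && git init -q /tmp/pack_demo && cd /tmp/pack_demo
git config user.email dev@example.com
git config user.name "Demo Dev"
echo base > base.txt && git add base.txt && git commit -qm "last working base copy"
ROOT=$(git rev-parse --abbrev-ref HEAD)

git checkout -qb feature_a "$ROOT"
echo a > a.txt && git add a.txt && git commit -qm "feature a"

git checkout -qb super_awesome_feature "$ROOT"
echo wow > wow.txt && git add wow.txt && git commit -qm "super awesome feature"

# my_super_pack is a merge commit of both features on top of the root.
git checkout -qb my_super_pack "$ROOT"
git merge -q -m "integrate feature_a + super_awesome_feature" feature_a super_awesome_feature
```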

CSA Branch Standards

At CSA I like to use a 3-branch methodology for managing projects.    The branches are master, prerelease, and integration. All 3 branches are only aligned when a new production version is released and there is no ongoing development on the project.
master – always points to the latest production release available to the general public.   This is where the current commit pointer ends up after integration and prerelease phases are complete and the production scripts are executed.  This branch is always tagged with the current production release number.  Developers only start new branches here if a prerelease branch does not exist.
git master branch
prerelease – always points to the latest release of the software that was published to the public in prerelease format. Prerelease software is not fully tested, though it has usually passed rudimentary functional testing. This is considered the “beta” version of the next master release. All CSA Premier Members and beta test groups are given access to prerelease software. This branch is always tagged with the current software version number, which is bumped if further changes are needed before “going to production”. Developers almost always start new branches here.
git prerelease branch
integration – this branch points to the current integration branch used by the project manager to pull together developer commits in preparation for conflict resolution and rudimentary software testing prior to being given an official “prerelease” stamp.  This is the release used for internal testing and development and should be considered unstable.    Developers rarely start new code branches on this branch.
git integration branch

 


Adding gotoIf and other Flow Control to Selenium IDE

selenium ide

Some of my Selenium IDE test scripts make use of the gotoIf command. Flow control logic like the gotoIf and label statements is not part of the standard Selenium IDE core library. Like most apps these days, Selenium IDE has a method for extending the base functionality with plugins and core extensions. The “sideflow” core extension from darrenderidder provides the oft-referenced goto controls for Selenium IDE.

Adding sideflow Flow Control To Selenium

I like to keep current with any repository updates, so I use git to clone the repository into my documents folder on my Vagrant WP Dev Kit CentOS box.    Using SmartGit I clone via the github URL:

https://github.com/darrenderidder/sideflow.git
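If you prefer the plain git command line to SmartGit, the same clone is one command. The target path below is just an example, chosen to match the extensions path used elsewhere in these posts:

```shell
# Clone the sideflow core extension; keep it somewhere stable so the
# Selenium IDE options can point at sideflow.js permanently.
git clone https://github.com/darrenderidder/sideflow.git ~/selenium-ide-for-slp
```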

Open Selenium IDE, go to the Options menu and select Options. Under the Selenium Core Extensions field, browse to the sideflow.js file that was cloned via git.

Installing Selenium IDE SideFlow Extension

The other option is to just download the sideflow.js file here.   No promises or guarantees this is current or will work on every system.  You should really grab the code from the source.

Example Selenium IDE Script

Here is an example script that uses flow control to ensure only ONE Store Locator Plus locations page is added. If the page already exists it skips the page setup process on my test site.

<table>
<thead>
<tr><td rowspan="1" colspan="3">SLP - Add Page, SLPLUS</td></tr>
</thead>
<tbody>
<tr>
	<td>setSpeed</td>
	<td>200</td>
	<td></td>
</tr>
<!--Open WP Pages Interface-->
<tr>
	<td>open</td>
	<td>/wp-admin/edit.php?post_type=page</td>
	<td></td>
</tr>
<tr>
	<td>waitForElementPresent</td>
	<td>id=doaction2</td>
	<td></td>
</tr>
<tr>
	<td>storeElementPresent</td>
	<td>xpath=//table[@class='wp-list-table widefat fixed pages']//td[@class='post-title page-title column-title']//a[contains(text(),'Locations')]</td>
	<td>slplus_page_exists</td>
</tr>
<tr>
	<td>gotoIf</td>
	<td>storedVars['slplus_page_exists']</td>
	<td>SKIP_LOCATION_PAGE_CREATION</td>
</tr>
<tr>
	<td>clickAndWait</td>
	<td>css=a.add-new-h2</td>
	<td></td>
</tr>
<tr>
	<td>type</td>
	<td>id=title</td>
	<td>Locations</td>
</tr>
<tr>
	<td>click</td>
	<td>id=content-html</td>
	<td></td>
</tr>
<tr>
	<td>type</td>
	<td>id=content</td>
	<td>[[slplus]]</td>
</tr>
<tr>
	<td>waitForElementPresent</td>
	<td>id=publish</td>
	<td></td>
</tr>
<tr>
	<td>clickAndWait</td>
	<td>id=publish</td>
	<td></td>
</tr>
<tr>
	<td>label</td>
	<td>SKIP_LOCATION_PAGE_CREATION</td>
	<td></td>
</tr>
</tbody>
</table>

Improved Grunt Tasks for the Vagrant WordPress Dev Box

Grunt WordPress Dev Kit

Last week I found myself having to rebuild my WordPress plugin development box after a “laptop fiasco”.   While it was a lot of work it feels as though I am in a better position to not only recover my environment quickly but also distribute it to other developers that are interested in assisting with plugin development.

If you are interested you can read more about it in the related WordPress Workflow and related WordPress Development Kit articles.

This morning I realized that having a new, almost-fully-configured Vagrant box for my WordPress Development Kit allows me to make assumptions in my Grunt tasks. While it would be more flexible to create options-based tasks where users can set their own configuration for things like MySQL usernames and passwords, the WP Dev Kit Vagrant box assumption allows me to bypass that for now and come back to it when time allows. Fast turnaround and fewer interruptions in my already-busy workflow are paramount this week.

Today’s WordPress Dev Kit Updates

The official tag I’ve assigned to the newest WordPress Dev Kit is version 0.5.0.  Here is what has been added.

WordPress Database Reset

One of the tasks I do fairly often is to “clear the data cruft” from my development box WordPress tables.  I  accomplish this by dropping the WordPress database and recreating it.

The Vagrant box makes this far easier: I know that when I spin up the WP Dev Kit Vagrant box it already has the WordPress MySQL tables set up, and I also know the username and password. As such I can execute a simple drop/create of the database, since the privileges are already in place in the MySQL metadata and will carry over. Thus I only need to execute a single mysql-cli command to get the data reset.
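That single command boils down to a drop and a create. A sketch, assuming the defaults on my box (root/root credentials and a database named wordpress_default; adjust for your own setup):

```shell
# Drop and recreate the WordPress database. The grants live in the
# mysql meta tables, so they survive the drop/create cycle.
mysql -u root -proot -e "DROP DATABASE IF EXISTS wordpress_default; CREATE DATABASE wordpress_default;"
```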

To get this working in Grunt I added the grunt-ssh module and created a ‘resetdb’ target.

I can now reset my WordPress table with a simple grunt command:


$ grunt shell:resetdb

Online Documentation

The other change I made today will help me remember how the heck all this stuff works. Now that the dev kit has grown to a couple of commands I know I will soon be forgetting the nuances of certain build and workflow processes. As I started creating my own Markdown files I realized that Bitbucket has a system for using .md files on the repository wiki. The easy solution was to add the Bitbucket wiki as a submodule to the WP Dev Kit repository and edit the files there. Doing so means that any doc update will also be published immediately when pushed back to the repo at the WP Dev Kit Bitbucket Wiki.

Now back to getting the Store Locator Plus and Enhanced Results first-pass testing run and prerelease copies published for my Premier Members.


Improving WordPress Plugin Development with Sass

SLP Sass Banner

If you’ve been following along since my WordCamp Atlanta trip this spring you know that I’ve been working on automating my WordPress plugin development and production process. If you missed it you can read about it in the WordPress Workflow and WordPress Development Kit articles. Since I needed to patch some basic CSS rules in my Store Locator Plus themes (plugin “sub-themes” that style the store locator interface within a page), I decided now was the time to leverage Sass.

Sass Is In The House

It was one of my first sessions at WordCamp Atlanta and I KNEW it was going to be part of my automation process.    Sass is a CSS pre-processor.   Store Locator Plus has its own “theme system”, a sort of plugin sub-theme that lives within the WordPress over-arching site theme.     The SLP themes allow users to tweak the CSS that renders the search form, map, and results of location searches to create in-page layouts that better fit within their WordPress theme layout.

Until this past release it was a very tedious process to update themes or create a new theme. In the latest release there are some 30-odd SLP theme files. The problem was that when I found an over-arching CSS issue, like the update to Google Maps images that rendered incorrectly on virtually EVERY WordPress theme in existence, fixing it was a HUGE PAIN. I was literally editing 30 files and hoping my cut-and-paste job went well. Yes, I could have used CSS include statements, but that slows things down by making multiple server requests to fetch each included CSS file. Since the store locator is the most-visited page on many retail sites, performance cannot be a secondary consideration. Sass deals with that issue for me and brings some other benefits with it.

There are PLENTY of articles that describe how to install Sass, so I am not going to get into those details here.  On CentOS it was a simple matter of doing a yum install of ruby and ruby gems and a few other things that are required for Sass to operate.  Google can help you here.

My Sass Lives Here…

For my current Sass setup I am letting NetBeans take care of the pre-compiling for me.    It has a quick setup that, once you have Sass and related ruby gems installed, will automatically regenerate the production css files for you whenever you edit a mixin, include, or base SCSS file.

NetBeans Sass Setup
NetBeans Sass Setup

I combine this with the fact that the assets directory is ignored by the WP Dev Kit publication and build tasks to create a simple production environment for my CSS files.   I store my SCSS files in the ./assets/stylesheets directory for my plugin.   I put any includes or mixin files in a ./assets/stylesheets/include subdirectory.     I configure NetBeans to process any SCSS changes and write out the production CSS files to the plugin /css directory.

The first thing I did was copy over a few of my .css files to the new stylesheets directory and changed the extension to .scss as I prepared to start building my Sass rules.

Includes

I then ripped out the repeated “image fix” rules that existed in EVERY .css file and created a new ./assets/stylesheets/include/_map_fix.scss file. This _map_fix file now becomes part of EVERY css file that goes into production by adding the line @import 'include/_map_fix'; at the top of the SLP theme .scss files. Why is this better? In the past, when Google or WordPress made changes, I had to edit 30+ files. Now I can edit one file if a map image rule changes that has to be propagated to all of the css files. However, unlike standard CSS includes, Sass preprocesses the imports and creates a SINGLE CSS file. That means the production server makes ONE file request instead of two. It is faster.

SLP Map Fix Include
SLP Map Fix Include

As I iterated through this process I ended up with a half-dozen CSS rules that appear in MOST of my CSS files. Since not all of the rules appear in all of my plugin theme files, I ended up with a half-dozen separate _this_or_that scss files that could be included in a mix-and-match style to get the right rule set for each theme. I also created a new _slp_defaults include file that does nothing more than include all of those half-dozen rules. Nearly half of the current CSS files use all of the rules that were “boiled out of” the CSS files.

Store Locator Plus Default Includes
Store Locator Plus Default Includes

Mixins

Along the way I learned about mixins.   At first I was a bit confused as to the difference between include files and mixins.  Both are “pulled in” using similar commands in SCSS, @import for the “include files” and @include for the mixins, but what was the difference?    While you can likely get away with “faking it” and having mixins act like includes they serve different purposes.   I like to think of a mixin as a “short snippet of a CSS rule”.

A common example is a “set the border style” mixin. In its simplest form it can set the border style with a rule for each of the browser variants. This is not a complete CSS rule but rather a portion of a CSS rule that may do other styling AND set a border. The mixin includes the -moz and other vendor-prefixed rule sets to accommodate each browser. Rather than clutter up a CSS entry with a bunch of border-radius settings, use a mixin and get something like:

.mystyle {
   @include mixin_border_radius;
   color: blue;
}

That is a very simplistic way of using a mixin. One advantage is that if you decide to change the default border radius settings in all of your CSS files you can edit a single mixin file. However that is not a typical use. Yes, you can create subsets of CSS rules, but it really gets better when you add parameters.

At a higher level a mixin is more than just a “CSS rule snippet”. It becomes more like a custom PHP function. In typical coder fashion, I snarfed this box-shadow mixin somewhere along the way:

// _csa_mixins.scss

@mixin box-shadow( $horiz : .5em , $vert : .5em , $blur : 0px , $spread : 0px , $color : #000000 ){
    -webkit-box-shadow: $horiz $vert $blur $spread $color;
    -moz-box-shadow: $horiz $vert $blur $spread $color;
    box-shadow: $horiz $vert $blur $spread $color;
}
@mixin csa_default_map_tagline {
    color: #B4B4B4;
    text-align: right;
    font-size: 0.7em;
}
@mixin csa_ellipsis {
  overflow: hidden;
  text-overflow: ellipsis;
  white-space: nowrap;
}

Since that rule is part of my default _csa_mixins that I tend to use in multiple places I use it as follows:


@import 'include/_csa_mixins';
@import 'include/_slp_defaults';

//blah blah boring stuff here omitted

// Results : Entries (entire locations)
.results_entry {
    padding: 0.3em;
    @include box-shadow(2px, 4px, 4px, 0px, #DADADA);
    border-bottom: 1px solid #DDDDDD;
    border-right: 1px solid #EEEEEE;
}

Notice how I now call in the include with parameters. This is passed to the mixin and the Sass preprocessor calculates the final rules to put in the CSS file. This makes the mixin very flexible. I can create all sorts of different box shadow rules in my CSS files and have the cross-browser CSS generated for me. No more editing a dozen box shadow entries every time I want to change a shadow offset.

Here is what comes out in the final production CSS when using the above mixin. You can see where the parameters are dropped into the .results_entry CSS rule:

.results_entry {
  padding: 0.3em;
  -webkit-box-shadow: 2px 4px 4px 0px #dadada;
  -moz-box-shadow: 2px 4px 4px 0px #dadada;
  box-shadow: 2px 4px 4px 0px #dadada;
  border-bottom: 1px solid #DDDDDD;
  border-right: 1px solid #EEEEEE; }

This is only the start of my journey with Sass and I’ve barely scratched the surface. However I can already see the benefits that are going to come from using Sass. In fact I already used it to fix a problem with cascading menus where one of the SLP theme files did not contain a rule set. Rather than copy-pasting from another theme file that contained the proper rules, I only needed to add @import 'include/_slp_tagalong_defaults' and the problem was fixed.

Going forward Sass will not only increase my throughput in CSS development but also improve the quality of the final product that reaches the customer.

My first SLP Theme file, Twenty Fourteen 01, built using these new Sass files, is only a 136-line CSS file with LOTS of whitespace and comments. When the final processing is finished it has all of the rules necessary to support the current add-on packs and style them nicely for the WordPress Twenty Fourteen theme in all major browsers.

A new SLP Theme: Twenty Fourteen Rev 01

Posted on

WordPress Dev Kit : Grunt 0.3.0 and Plugin 0.0.2

Grunt WordPress Dev Kit

More refinements have been made this week to my WordPress Workflow and related WordPress Development Kit.  With new products going into production shortly and some older products coming out with new releases, I realized I needed a more efficient way to publish prerelease copies.  As part of the Premier membership program I am trying to get stable prerelease products in the hands of those Premier members that want them.   Some members like to test new releases or try out new features on their test systems before they come out.    It allows them to plan for future updates and provides an opportunity for feedback and updates before the new version is released.  A win-win for the Premier member and for Charleston Software Associates.

In order to support a formal prerelease and production configuration I realized I needed to be able to track two different versions and release dates separately.   Following the general format presented in other Grunt examples, this meant setting up new sub-sections within the plugins.json pluginMeta structure.   The new format looks something like this:

"wordpress-dev-kit-plugin": {
"production": {
"new_version": "0.0.02",
"last_updated": "2014-04-03"
},
"prerelease": {
"new_version": "0.0.03",
"last_updated": "2014-04-04"
},
"publishto" : "myserver"
}

Changing the structure meant updating both the Gruntfile.js for the development kit as well as the parser that is in the WordPress Development Kit plugin. The changes were relatively minor to address this particular issue, but I did learn some other things along the way.
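To illustrate the parsing side, here is a rough sketch of how the nested structure resolves a version and date for a given target. This is a hypothetical helper for illustration only; the actual parser in the WP Dev Kit plugin is PHP.

```javascript
// Hypothetical helper illustrating the new nested structure: given one
// slug's metadata, pull the version and date for a target.
function getReleaseInfo(pluginMeta, target) {
    var release = pluginMeta[target]; // "production" or "prerelease"
    if (!release) {
        throw new Error('Unknown target: ' + target);
    }
    return {
        version: release.new_version,
        updated: release.last_updated,
        publishto: pluginMeta.publishto
    };
}

var meta = {
    production: { new_version: '0.0.02', last_updated: '2014-04-03' },
    prerelease: { new_version: '0.0.03', last_updated: '2014-04-04' },
    publishto: 'myserver'
};

getReleaseInfo(meta, 'prerelease').version; // '0.0.03'
```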

Tasks and Targets

In my own Grunt tasks I had been calling one of my parameters in my build sequence the “type”, as in the “build type”. However the configuration file examples online often talk about a “target”. A target would be something like “production” or “prerelease” that shows up in a configuration block like this one:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/",
        showProgress: true
    },
    production: { expand: true, cwd: "../public/<%= grunt.task.current.target %>/", src: ["<%= currentPlugin.zipbase %>.zip", "plugins.json"] },
    prerelease: { expand: true, cwd: "../public/<%= grunt.task.current.target %>/", src: ["<%= currentPlugin.zipbase %>.zip", "plugins.json"] }
},

I have updated my scripts and documentation terminology to refer to this parameter as the “target” to follow convention.

Simplify Configuration With grunt.task.current.target

I learned a new trick that helps condense my task configuration options. In one of my interim builds of the WordPress Dev Kit I had something that looked more like this:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        showProgress: true
    },
    production: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip", "plugins.json"],
        path: '<%= myServer.path %>production/',
        srcBasePath: "../public/production/"
    },
    prerelease: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip", "plugins.json"],
        path: '<%= myServer.path %>prerelease/',
        srcBasePath: "../public/prerelease/"
    }
},

A bit repetitive, right? I found you can use the variable grunt.task.current.target to drop the current target name as a string into a configuration directive:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        showProgress: true
    },
    production: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip", "plugins.json"],
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/"
    },
    prerelease: {
        expand: true,
        cwd: "../public/<%= grunt.task.current.target %>/",
        src: ["<%= currentPlugin.zipbase %>.zip", "plugins.json"],
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/"
    }
},

Now that the prerelease and production path and srcBasePath settings are identical they can be moved into the top options section.

Now if I can just figure out how to share the FILE configurations that define the current working directory (cwd), source (src), and destination (dest) file sets, I could eliminate ALL of the settings in the production and prerelease configuration blocks and leave them as a simple “use the defaults” setup like this:

// sftp
//
sftp: {
    options: {
        host: '<%= myServer.host %>',
        username: '<%= myServer.username %>',
        privateKey: '<%= myServer.privateKey %>',
        passphrase: '<%= myServer.passphrase %>',
        path: '<%= myServer.path %><%= grunt.task.current.target %>/',
        srcBasePath: "../public/<%= grunt.task.current.target %>/",
        showProgress: true
    },
    production: { },
    prerelease: { }
},

Maybe someday.

Simplify Configuration With Shared Variables

Another trick I learned is that common configuration strings can be put at the top of the configuration block and re-used. This is alluded to on the Grunt tasks configuration page, but they never expound on how to use the common configuration variables. Here is how it works: define the variable at the top of the configuration block, then reference that variable inside a string like '<%= my_var %>'. Here is my example with some “fluff” missing from the middle:

// Project configuration.
grunt.initConfig({

    // Metadata.
    currentPlugin: currentPlugin,
    myServer: myServer,
    pkg: grunt.file.readJSON('package.json'),
    my_plugin_dir: '/var/www/wpslp/wp-content/plugins/<%= currentPlugin.slug %>',
    my_src_files: [
        '**',
        '!**.neon',
        '!**.md',
        '!assets/**',
        '!nbproject/**'
    ],

    // compress
    //
    compress: {
        options: {
            mode: 'zip',
            archive: '../public/<%= grunt.task.current.target %>/<%= currentPlugin.zipbase %>.zip'
        },
        prerelease: { expand: true, cwd: '<%= my_plugin_dir %>', src: '<%= my_src_files %>' },
        production: { expand: true, cwd: '<%= my_plugin_dir %>', src: '<%= my_src_files %>' }
    },

In this example you can see how I use the my_plugin_dir variable to set the path to the plugins I am working on on my dev box, and my_src_files to list the files I want to include (or ignore) when pushing my development plugin directories to a zip file or the WordPress svn repo for publication.
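Under the hood, Grunt runs these <%= … %> strings through Lodash-style templates against the config object. A much-simplified illustration of the idea follows; this is a toy expander I wrote for this post, not Grunt’s actual implementation, and it only walks dotted property paths rather than supporting arbitrary expressions.

```javascript
// Toy illustration of Grunt-style <%= ... %> template expansion against
// a config object (real Grunt uses Lodash templates with more features).
function expand(template, config) {
    return template.replace(/<%=\s*([\w.]+)\s*%>/g, function (match, keypath) {
        // Walk dotted paths like "currentPlugin.slug" through the config.
        return keypath.split('.').reduce(function (obj, key) {
            return obj[key];
        }, config);
    });
}

var config = {
    currentPlugin: { slug: 'store-locator-le' },
    my_plugin_dir: '/var/www/wpslp/wp-content/plugins/'
};

expand('<%= my_plugin_dir %><%= currentPlugin.slug %>', config);
// '/var/www/wpslp/wp-content/plugins/store-locator-le'
```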

This has simplified a lot of task configuration entries in my custom grunt tasks script.

That, combined with smarter configuration blocks in areas like the SFTP node module, has simplified my Grunt configuration, which will make it less error-prone and easier to maintain going forward.

Back to coding…

Posted on

WordPress Dev Kit Plugin : 0.0.1

Grunt WordPress Dev Kit

For those of you that have been following along with my exploration of Grunt and my plugin workflow automation… I’m sorry. Quite boring, I’m sure, but remember this blog is my personal notebook as much as fodder for the 3 people that may be interested in this stuff.

Last night I extended my journey toward fewer manual updates, since each manual step adds an opportunity for human error, by creating the first WordPress Development Kit Plugin.  Yup, a plugin for plugin development.  Sort of.  What the new plugin is going to help me with is keeping my Plugin Version Info page updated.   Today it is very rudimentary with a basic list of plugin slugs, version info, and release dates.  You can see it in action on my Plugin Version Info page. Ultimately the new Plugin companion to the WordPress Development Kit will be extended to get more information from the Grunt process into the public website via the automated toolkit.

My goal is to have ONE PLACE where plugin information is updated, preferably in the readme files.    For now the JSON file that drives Grunt will suffice, with future plans to scrape the readme data into the plugins.json file.

What WordPress Dev Kit Plugin Does

The WordPress Dev Kit plugin is very simplistic in its current form.  It reads the plugins.json file from the WP Dev Kit and renders the information via a shortcode on a page or post.

Version 0.0.3 of the plugin has the following shortcode attributes available:

    • Actions (default: list)
      • [wpdevkit action='list'] lists details about all plugins
    • Styles (default: formatted)
      • [wpdevkit action='list' style='formatted'] lists the details in an HTML formatted layout
      • [wpdevkit action='list' style='raw'] lists the details in a print_r raw format
    • Types (default: basic)
      • [wpdevkit action='list' type='basic'] lists basic details: version, updated, directory, WP versions
      • [wpdevkit action='list' type='detailed'] lists all details: version, updated, directory, WP versions, description
    • Slug (default: none = list ALL)
      • [wpdevkit action='list' slug='wordpress-dev-kit-plugin'] lists details about a specific plugin

The raw style lists the entire plugin metadata structure in a pre tag as a standard PHP print_r dump.

WP Dev Kit Plugin Setup

It expects that you have the WordPress Development Kit plugins.json file pushed to a production files directory on your server.   You can set the location of the file to any web-server-readable directory.    The location is specified in the Settings / WP Dev Kit menu in the admin panel.

If you are using the WP Dev Kit Grunt tasks, the build:<slug>:production process will move the plugins.json file over to your server along with your production zip files.

My Process And How This Helps

In case you’re wondering why I would go through the trouble of building a full-fledged plugin like this, here is my typical workflow:

  1. Edit code.
  2. Edit readme to update version, features, etc.
  3. Edit main plugin to update version, etc.
  4. Create zip file.
  5. FTP zip file to server.
  6. If a WordPress Plugin Directory plugin, fetch SVN repo, update trunk, commit, add branch, commit, push.
  7. Login to my website.
  8. Update version info page on website.
  9. Create blog post about new plugin update with features and screen shots if warranted.
  10. If a Store Locator Plus plugin update the HTML that is pushed through the in-product signage app.

Until today nearly every step of that process was manual.  Now with the WP Dev Kit running on my system and the WP Dev Kit Plugin on my public site the process is now:

  1. Edit code.
  2. Edit readme to update version, features, etc.
  3. Edit main plugin to update version, etc.
  4. Edit the WP Dev Kit JSON file.
  5. grunt build:<slug>:production  which automatically
    1. checks that the version in the readme.txt and plugin file match (new quality control test)
    2. creates the zip file
    3. uses SFTP to put the file on my server
    4. updates the WordPress Plugin Directory Listings (fetch, update trunk, commit, add branch, commit, push)
    5. pushes plugins.json, which talks to the WP Dev Kit Plugin and keeps the version info page updated
  6. Login to my website.
  7. Create blog post about new plugin update with features and screen shots if warranted.
  8. If a Store Locator Plus plugin update the HTML that is pushed through the in-product signage app.

The magic happens in steps 4 and 5.   It automates many of the steps in the process.  As I refine the WordPress Dev Kit I will be able to eliminate more steps along the way.
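The quality-control check in step 5.1 can be sketched roughly like this. This is a hypothetical helper, not my actual Grunt task code; it assumes WordPress’s standard “Stable tag:” readme field and “Version:” plugin header.

```javascript
// Hypothetical quality-control helper for step 5.1: fail the build when
// the readme.txt "Stable tag" and the plugin header "Version" disagree.
function versionsMatch(readmeText, pluginHeaderText) {
    var readmeVersion = /Stable tag:\s*([\d.]+)/.exec(readmeText);
    var headerVersion = /Version:\s*([\d.]+)/.exec(pluginHeaderText);
    if (!readmeVersion || !headerVersion) {
        return false; // missing either version string fails the check
    }
    return readmeVersion[1] === headerVersion[1];
}

versionsMatch('Stable tag: 4.1.10\n', ' * Version: 4.1.10\n'); // true
```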

Not a bad start.   As each new plugin update happens I will be refining and improving the automation files and plugin to create a better presentation and improve quality control at each step.

Learn about this process via the WordPress Development Kit articles.

Posted on

WordPress Dev Kit : Grunt Helpers 0.2.0

Grunt WordPress Dev Kit

I am continuing on my quest to use Grunt to further automate my plugin development process. I hope to generate better quality plugins through more automated testing and sanity checks during the production process. I also hope to take out a few of the steps required to get the plugins into the hands of my customer base. The WordPress Development Kit articles describe the process as it evolves, both to possibly help others that are considering automating with Grunt and to help people that are starting to work on plugins related to those I’ve created understand my process.

My Environment For Grunt Helpers 0.2.0

I’ve made some changes to my plugin production environment since the original “Legacy Publisher” and even my “Grunt Helpers 0.1.0” article.   Some of the changes are based on things I’ve seen elsewhere as I learn about Grunt, and other changes are meant to create a better-defined environment.   Here is a summary of how things are working with the Grunt automation in the latest iteration.

I have two types of production (ready for public consumption) plugins:

WordPress Plugin Directory Listings

The WordPress Plugin Directory Listings (WordPress Hosted) are managed via the standard subversion (svn) repository update process.   They are also packaged and put on my server for customers to download from my site.

Premium Plugins Served From My Site

The Premium Plugins (Premium) are only hosted on my server and made available to customers that have purchased the premium add-on packs.

For both types of plugins, whether WordPress Hosted or Premium, I now have two versions available: the production release (production), which is the official ready-for-deployment version, and the pre-release (prerelease), which is provided to select customers for early testing of an upcoming release.   Pre-release versions are beta releases that may not be fully tested.

Regardless of whether a plugin is a WordPress Hosted product or a Premium product, the pre-release is ONLY available from my server and only to select customers.    I use the Groups plugin combined with the WooCommerce add-on pack for Groups as well as the File Away plugin to manage access to the pre-release versions.

The Grunt Tasks

To support the various iterations of the plugins that are being produced I have created several grunt tasks that are managed by my WordPress Development Kit scripts.

grunt build:<slug>:production

This task will publish the production-ready copy of my plugin.   It goes through my development copy of the plugin directory, cleans out the “development cruft” and builds a .zip file that is ready for distribution.    It will then read the plugins.json file and determine the ultimate home for the production copy.  If the “publishto” property for the given slug is “wordpress” (maybe I should change it to “WordPress”) the product ends up on the WordPress Plugin Directory.    If the publishto property is “myserver” the product ends up on my web server in the production files directory.

grunt build:<slug>:prerelease

This task also cleans out the “development cruft” of the plugin directory and creates a zip file.   However in this mode the plugin zip file only ends up on my server, regardless of the publishto property set in the plugins.json file.   The prerelease files are stored in a separate directory on my server from the production files.

My Grunt Configuration Files

There are now several files that are used to configure my development kit.

package.json

This now follows a standard Grunt package.json format.   It contains my author info, the grunt default project name, version, description, license, and a list of grunt dependencies.   Grunt dependencies are node.js modules that are used by this project.  Currently the list includes:

Those with the asterisk (*) are not currently used, but I know I will be making use of them in the future so I’ve left those recommended default modules in place.

plugins.json

This is now the home for all of the metadata about my WordPress plugins that are being managed by the WordPress Development Kit.  This includes variables that are later used by modules such as wp-deploy as well as my plugin specific information such as the slug and other information.   The format is typical JSON with the following nodes defined:

    • pluginMeta = an array of plugin slugs
      • <slug>
        • version = current production version of the plugin (most likely will be deprecated and auto-read from readme.txt)
        • name = the plugin name (most likely will be deprecated and auto-read from readme.txt)
        • description = the plugin description (most likely will be deprecated and auto-read from readme.txt)
        • publishto = where production files get published, either “wordpress” (WordPress Hosted) or “myserver” (Premium)
        • zipbase = the base name of the zip file, used if I want to create myplugin-test.zip instead of myplugin.zip
        • reposlug = the WordPress Hosted repository slug if not the same as <slug>
    • wp-plugin-dir = where on this server, my development server, do the in-development plugins live
    • wp-username = my WordPress Plugin Directory username
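As a sketch of how a Gruntfile can consume this metadata: the real kit loads the file with grunt.file.readJSON, while the helper and fallback behavior here are my illustration and may not match the kit exactly.

```javascript
// Hypothetical helper: resolve one plugin's build settings from the
// plugins.json structure described above.
function pluginSettings(config, slug) {
    var meta = config.pluginMeta[slug];
    return {
        zipfile: meta.zipbase + '.zip',
        reposlug: meta.reposlug || slug, // fall back to <slug> when unset
        publishto: meta.publishto
    };
}

// In a real Gruntfile: var config = grunt.file.readJSON('plugins.json');
var config = {
    pluginMeta: {
        'store-locator-le': { zipbase: 'slp4', publishto: 'wordpress' }
    },
    'wp-plugin-dir': '/var/www/wpslp/wp-content/plugins/'
};

pluginSettings(config, 'store-locator-le').zipfile; // 'slp4.zip'
```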

myserver.json

This file contains the details about my server that help with the SFTP push to my server.  It includes things like my local path to the SSH key files and where I want to put stuff on the remote server.

    • host = my fully qualified host name (no http, etc. just a pure host name)
    • username = the username to use for sftp login
    • privateKeyFile = the path on my development system where my private key file lives
    • privateKeyPassFile = the path on my development system where the private key password is stored (will become a command line option, this is a security risk, though my dev system is fairly well locked down and isolated)
    • path = the path on the production server that is the root directory under which my production and pre-release files will go.

Process Details

Here is what happens when I execute the two most-used commands in my new build kit.

WordPress Hosted Production

command: grunt build:<slug>:production

This process gathers all the files in my plugin directory and copies the “cleaned” version of the files into ./assets/build under the plugin directory itself.  I store it here because assets is one of the directories that is ignored (cleaned) during production and it allows me to see exactly what got moved into production.    These files are then published to the WordPress svn repository using the wp-deploy module.   The same files are sent through compress with a .zip file created and stored in the wp-dev-kit/public directory (yes this seems redundant and can probably be simplified).  Finally the zip file from that public directory is sent over to my live server with SFTP and put into the production folder on the live server.

[box type=”alert”]I did have to create a patch for wp-deploy to properly manage subdirectories. I have submitted the patch to the author. Hopefully it makes it into his production release soon.[/box]

Premium Production

command: grunt build:<slug>:production

This process is identical to the WordPress Hosted Production process with one exception.  It skips the wp-deploy call and does nothing with svn.  It creates the ./assets/build files, zips them, puts them in the WP Dev Kit public folder, and copies them over to the production folder on my server.

WordPress Hosted and Premium Prerelease

command: grunt build:<slug>:prerelease

This process works much like the Premium Production process.  The only difference in the process is that the files are copied over to my server into a different directory that holds pre-release plugin files only.

Further Refinements

There are a number of refinements to be made to the process.  First on my agenda is reading the readme.txt file and extracting the version information so it can be appended to the zip file names for pre-release copies of the products.    I will then work on things like running the third party modules to do gettext sanity tests, JavaScript minification, and other “would be nice” features for production plugins.   I will also likely change this process a dozen times as I continue to iterate over both Premium and WordPress Hosted plugin builds over the next few months.
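The first refinement named above can be sketched like this. This is a hypothetical helper, not the dev kit’s actual code; it assumes the version lives in WordPress’s standard “Stable tag:” readme field and a simple zipbase-version naming scheme.

```javascript
// Hypothetical refinement: read the version from readme.txt and build a
// versioned prerelease zip name from the plugin's zipbase.
function prereleaseZipName(zipbase, readmeText) {
    var match = /Stable tag:\s*([\d.]+)/.exec(readmeText);
    if (!match) {
        throw new Error('No Stable tag found in readme.txt');
    }
    return zipbase + '-' + match[1] + '.zip';
}

prereleaseZipName('slp4', 'Stable tag: 4.1.10\n'); // 'slp4-4.1.10.zip'
```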

In the end I hope to have a handful of simple Grunt commands that manage the process efficiently and reduces the human error that my current process can introduce at several steps.

If you have suggestions or feedback on how to improve my process, please let me know!


Posted on

WordPress Dev Kit : Grunt Helpers 0.1.0

Grunt WordPress Dev Kit

Now that I have a very basic understanding of Grunt it is time to start incorporating it into my WordPress Development Kit.  My hope is not to simply replace the current Legacy Publisher elements of my WP-Dev-Kit scripts, but to speed up the production process and improve the quality of the plugins.

I can now start writing JavaScript to parse and process the myriad of files that are updated with each release.  I can mix in system commands and replace the Bash script processing to update repositories, upload files to my servers, and hopefully even start an automated web content production script.

WordPress-Centric Grunt Scripts

In addition there are at least a handful of pre-existing Grunt scripts that will improve the quality of my plugins, doing things like creating the POT files and checking my text domains (an ongoing problem for me) to assist in language translations.    There are also scripts to check that the version numbers in my readme and base plugin file match (bitten by that one more than once) and even generate a readme.md for Github or Bitbucket repos.    I’m sure I’ll discover more soon; here are the few I know about today:

checktextdomain by Stephen Harris
Checks the correct text domain is passed when using the WordPress translation functions.

checkwpversion by Stephen Harris
Make sure your plugin version numbers are all in sync in the plugin header, readme.txt and the Grunt package.json files.

pot by Stephen Harris
Generates a .POT file that can be used for translations.

wp-readme-to-markdown by Stephen Harris
Converts readme.txt file to readme.md for use in Github repo.

Setting Up The WP Dev Kit

You can follow along by cloning the git repository:

cd ~
git clone git@bitbucket.org:lance_cleveland/wp-dev-kit.git ./wp-dev-kit
cd ./wp-dev-kit/grunt
npm install

[box type=”alert” style=”rounded”]The WP Dev Kit Legacy Publisher scripts store published plugins in your ~/myplugins directory. With the newer Grunt scripts they are stored under the ./wp-dev-kit/public directories. If you mix-and-match legacy and Grunt scripts you will have inconsistent published zip files unless you symlink the directories.[/box]

After cloning the repository you will want to edit the plugins.json file in the ./grunt subdirectory to replace the plugin files with your own list of plugins that you are developing and/or managing.

[box type=”alert” style=”rounded”]Make sure you edit the ./grunt/plugins.json file to match the list of plugins you are developing.[/box]

If you are feeling adventurous you can even use git to clone my public Store Locator Plus plugin code into your WordPress plugins directory.

My WP Dev Kit Grunt Tasks

The following tasks are part of my WP-Dev-Kit Grunt file.

build:<slug>[:<type>]

Builds the distribution file, a .zip file, for the specified plugin and puts it in the wp-dev-kit public folder, ready for distribution.  As the Grunt toolkit expands this will also do some cleanup and maintenance like running version checks and textdomain checks, minifying CSS and JavaScript, and other nice things like that!

The type parameter can extend the build and make it do other things like:

makezip
The default mode.  Just build the zip file and put it in the public folder of the WP dev kit.

publish
Reads the package.json parameters and decides whether to SFTP the zip file to a private server or unpack it into a new tag in the svn repo and push it to the public WordPress Plugin Directory.   This will also push the zip file and plugins.json file to the server specified in myserver to allow for integration with some new WordPress plugins that are coming to publish plugin version lists and other “nice things” on a WordPress plugin store website like StoreLocatorPlus.com.

Set the plugins.json pluginMeta slug property 'publishto' to either 'wordpress' or 'myserver' for each plugin being managed.

list

This will list the slugs for the plugins you are managing with the Grunt portion of the wp-dev-kit.

details

This will list the plugin names, slugs, and current version of the WordPress plugins being managed by the dev kit.

Configuring The Grunt Tasks

There are two files that need to be edited in the latest version of my WP Dev Kit: the plugin details file, plugins.json, and the public plugin server details file, myserver.json.

plugins.json

The plugins details file contains the settings for the WordPress plugins that you will manage with the WP Dev Kit.  The settings are as follows:

wp-plugin-dir – where is your WordPress plugin directory on this server?

pluginMeta – an array of plugin slugs with sub-keys that specify details about the plugin such as:

version – the current production version in major.minor.patch format.

zipbase – the name of the zip file to be created, usually the same as the slug.

publishto – where to send the production files when done, which can be one of the following:

wordpress = put it in the WordPress Plugin Directory via the svn repository

myserver = put it on the server specified in the myserver.json file using SFTP

name – the name of the plugin, usually as it appears in the readme.txt for the plugin

description – the description for the plugin, usually as it appears in the readme.txt

{
  "pluginMeta": {
    "slp-enhanced-results": {
      "version": "4.1.03",
      "zipbase": "slp4-er",
      "publishto" : "myserver",
      "name": "Store Locator Plus : Enhanced Results",
      "description": "A premium add-on pack for Store Locator Plus that adds enhanced search results to the plugin."
    },
    "store-locator-le": {
      "version": "4.1.10",
      "zipbase": "slp4",
      "publishto" : "wordpress",
      "name": "Store Locator Plus",
      "description": "Manage multiple locations with ease. Map stores or other points of interest with ease via Google Maps.  This is a highly customizable, easily expandable, enterprise-class location management system."
    }
  },
  "wp-plugin-dir": "/var/www/wpslp/wp-content/plugins/"
}

myserver.json

This file contains the connection details for the SFTP server used for the myserver zip file publication, as noted in the plugins.json description.   Settings include:

host – the fully qualified host name for the server

username – which user to login as

privateKeyFile – the path to the private key file, usually named id_rsa in your ~/.ssh directory.

privateKeyPassFile – the path to a secured text file, which should NOT be part of any public repository, that contains your private key file passphrase.

path – the path on the server where the files should be stored, relative to the username login directory.

{
    "host": "www.charlestonsw.com",
    "username": "serveradmin",
    "privateKeyFile": "/home/myuser/.ssh/id_rsa",
    "privateKeyPassFile": "/home/myuser/private/.pkpass",
    "path": "/var/www/wpslp/production/premium/"
}

That’s enough for this round of Grunt automated WordPress plugin publishing. I can see how this is going to help improve my process in the future even though it is adding an extra step today. I will continue to update this development kit for my production environment. As I do I will add more posts like this one and describe how I manage my free Store Locator Plus plugin and premium add-on packs.

Posted on

What I Learned About Grunt Compress

Grunt Configuring Tasks Banner

I learned a few interesting things while setting up my WordPress Development Kit project with some new Grunt tasks.   These are my findings while getting some custom build tasks working with the Grunt compress plugin (grunt-contrib-compress).

You can set the default compress mode to zip by using the main.options attribute in the JSON config parameters.

You can tell Compress where to put the final zip file and what to name it using dynamic elements, such as the plugin slug as the base file name, by using an anonymous function in the archive property.

Use the files property to set an array of files to be created.   In each element of the array you specify attributes such as what files are to go INTO the zip file, where they are stored in the zip file, whether or not to use relative paths and more.    This is where things were a bit confusing for me so I’ll expand on that in a moment.

Here are the starting Grunt Compress settings I am working with.  I will explain what this does below:

    compress: {
        main: {
            options: {
                mode: 'zip',
                archive: function() {
                    return slug + '.zip';
                }
            },
            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/',
                    src: ['*'],
                    dest: 'public/',
                }
            ]
        },
    },

Here is a snippet of code that goes with the above configuration and puts it to work.

  /**
   * makezip
   *
   * Build the zip file for the plugin, avoiding the assets directory, .git folders, and .neon files.
   */
  grunt.registerTask('makezip', 'Make a zip file for the plugin.', function(slug){
      grunt.log.writeln('Placeholder for makezip.  Slug name is ' + slug);
      global.slug = slug;
      grunt.task.run('compress');
  });

What Is In The Above Configuration

One of the first important things to note is that Grunt has a fairly robust built-in file manager. This file manager is available to all tasks and allows task files to use a default set of file rules such as the cwd, expand, src, and dest properties you see in the configuration section above. The Files section of the Grunt Configuring Tasks page will provide more insight beyond what I describe below.

archive

In the example above this is an anonymous function. The global variable “slug” is set in the makezip task and this is used to create the final zip file name. In my case it will be the WP Plugin Slug and .zip such as store-locator-le.zip for my Store Locator Plus plugin.

files.expand

The expand property tells the Grunt file processor to do dynamic source-and-destination processing.

files.cwd

Instructs the current processor, Compress in this case, to strip out the fully qualified path and make all file names in the processing loop relative to the parameter in the cwd command.  In my case it will make all files relative to my WordPress plugin root directory /var/www/wpslp/wp-content/plugins/.

files.src

This tells Compress which files are to be included in this round of processing. For Compress these are the files that will go into the .zip distribution. It uses the rules of a file pattern matching library called minimatch. minimatch will grab as FEW files as possible, so the ‘*’ rule here works differently than typical operating-system wildcard file listings: it will ONLY match the files in the exact directory specified. In my case that means only the FILES (no directories or subdirectories) in my wp-content/plugins directory, which grabs nothing but the legacy publisher scripts I put in the WP plugins directory on my dev box (blah, what a bad design). I will explain how I fix this later.

files.dest

This one kind of threw me. You can see I put public/ in as my destination. I THOUGHT it would put the resulting .zip file in a folder named public under my current Grunt working directory with a <slug>.zip file in it. WRONG. What this actually does is tell Compress where, inside the resulting zip file, to put the files it “gathers” with the files.src pattern noted above.

In the setup above it created a file in the ROOT grunt directory named store-locator-le.zip.   Inside that zip file is a folder named “public” in which all the contents of my WP Plugin directory (base files only) reside.  NOT what I wanted!

My Grunt Compress Mess

Fixing The Initial Grunt Problems

The first thing to fix is getting the .zip file to go to the ./wp-dev-kit/public folder where I will fetch it with other tasks for publication to the WordPress public plugin directory or to my server for private access for premium add-on packs.   There are two items to fix: files.dest and the archive path.

Removing the dest: property from my files section solved the first issue.   Now the files that match the src specification will go into the top-level of the .zip that is created.

Adding ../public/ to the start of the path returned by my anonymous archive function stores the zip file in my public folder, which resides next to the running Grunt tasks folder.

The first two relatively minor issues are fixed, but there are deeper issues to resolve: grabbing only the files in the specified plugin directory, and then adding some rules to ignore the files I don’t want in the kit.

    compress: {
        main: {
            options: {
                mode: 'zip',
                archive: function() {
                    return '../public/' + slug + '.zip';
                }
            },
            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/',
                    src: ['*'],
                }
            ]
        },
    },

Step 2 – Only Get The Specified Plugin

Getting the specified plugin directory wasn’t difficult, but it did involve a bit of Googling and learning about named variable configuration in Grunt, and how to get variables into my “variable space” so I can use the <%= varname %> syntax in my Compress settings. Luckily Chris Wren wrote a nice Grunt-related article for newbs such as myself.

First step, add the variable declaration at the top of the Gruntfile, right above grunt.initConfig and inside the module.exports.

Grunt currentPlugin "global"
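
The screenshot above shows the declaration; since the exact code only appears in the image, here is a sketch of what that placement presumably looks like: an object defined in the Gruntfile scope, above grunt.initConfig and inside module.exports, so both the config templates and the tasks can reach it.

```javascript
// Gruntfile.js fragment (sketch -- the exact declaration is only
// shown in the screenshot above).
module.exports = function (grunt) {

  // Shared "global" for this Gruntfile: tasks write to it,
  // <%= currentPlugin.slug %> templates read from it.
  var currentPlugin = {
    slug: ''
  };

  grunt.initConfig({
    // ... compress config using currentPlugin.slug goes here ...
  });
};
```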

With that in place I can now tell the Compress plugin to make all file processing relative to the plugin slug directory and use the same variable to set my zip file base name:

    compress: {
        main: {
            options: {
                mode: 'zip',
                archive: function() {
                    return '../public/' + currentPlugin.slug + '.zip';
                }
            },
            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/<%= currentPlugin.slug %>',
                    src: ['**'],
                }
            ]
        },
    },

Inside my makezip task I now set the currentPlugin variable properties instead of a generic global variable, which is what I really wanted to do in the first place:

  /**
   * makezip
   *
   * Build the zip file for the plugin, avoiding the assets directory, .git folders, and .neon files.
   */
  grunt.registerTask('makezip', 'Make a zip file for the plugin.', function(slug){
      grunt.log.writeln('Placeholder for makezip.  Slug name is ' + slug);
      currentPlugin.slug = slug;
      grunt.task.run('compress');
  });

Now I am only getting the files for the specified plugin and not the WordPress plugin directory root files. While I was in there I also changed the src parameter from * to **. ** will grab all files in the current directory and any sub-directories that are part of my plugin.
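
Those <%= currentPlugin.slug %> tokens are Grunt template strings, expanded against the config at run time. As a rough illustration of what the expansion does (this regex stand-in is purely illustrative; it is NOT Grunt's implementation, which uses Lo-Dash templates):

```javascript
// Illustrative stand-in for Grunt's <%= ... %> template expansion,
// only here to show how currentPlugin.slug lands in the cwd string.
function expand(template, scope) {
  return template.replace(/<%=\s*([\w.]+)\s*%>/g, function (match, name) {
    // Walk dotted property paths like "currentPlugin.slug".
    return name.split('.').reduce(function (obj, key) {
      return obj[key];
    }, scope);
  });
}

var currentPlugin = { slug: 'store-locator-le' };
var cwd = expand(
  '/var/www/wpslp/wp-content/plugins/<%= currentPlugin.slug %>',
  { currentPlugin: currentPlugin }
);

console.log(cwd);
// /var/www/wpslp/wp-content/plugins/store-locator-le
```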

Excluding Folders and Files

The last step will be to filter out those files I don’t want to include per my “no assets directory”, “no .git”, “no nbproject” and no “apigen.neon” files or folders.   If you follow my WordPress work flow posts you’ll know that this is my development environment and I prefer to work “inline” with the plugin directory and clear out the “cruft” of development files in the production cycle.

Thankfully the Grunt file processor makes excluding files a simple task.   I extend the src property with some “do not include” settings like so:

            files: [
                {
                    expand: true,
                    cwd: '/var/www/wpslp/wp-content/plugins/<%= currentPlugin.slug %>',
                    src: [
                        '**',
                    '!**/*.neon',
                    '!**/*.md',
                        '!assets/**',
                        '!nbproject/**'
                    ],
                }
            ]

That source specification will get all files in the main and sub-directories of my plugin EXCEPT for anything ending in .neon or .md and anything in the assets or nbproject sub-directories. By default the file matcher also ignores any files starting with a dot, such as my .git, .gitignore, and .gitmodules folders and files.
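
To sanity-check the intent, here are the same include/exclude decisions written out as plain JavaScript. This is only a restatement of the rules above, not minimatch's actual matcher, and the helper name is made up:

```javascript
// Plain-JS restatement of the filter rules described above -- NOT
// minimatch itself, just the same decisions made explicit.
function includeInZip(relPath) {
  var parts = relPath.split('/');
  var name = parts[parts.length - 1];

  if (name.charAt(0) === '.') return false;   // dot files, ignored by default
  if (/\.neon$/.test(name)) return false;     // exclude .neon files
  if (/\.md$/.test(name)) return false;       // exclude .md files
  if (parts[0] === 'assets') return false;    // exclude the assets directory
  if (parts[0] === 'nbproject') return false; // exclude the nbproject directory
  return true;
}

console.log(includeInZip('store-locator-le.php')); // true
console.log(includeInZip('apigen.neon'));          // false
console.log(includeInZip('assets/banner.png'));    // false
console.log(includeInZip('.gitignore'));           // false
```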

Sweet!

I’m already liking Grunt WAY more than writing Bash files!

Follow along with other blog posts about the WordPress Workflow and WordPress Development Kit.

Posted on

Preparing A WordPress Plugin for Grunt Automation

Grunt Getting Started Banner

Thanks to my experience at WordCamp Atlanta (#wcatl), I’ve spent the past couple of days learning about Vagrant and how I can use it to build and distribute new VirtualBox systems to my developer-team-in-training. I will refine that process to get new members of the development team set up with a CSA Standard environment and bring them up to speed with my process flow with less effort.

Today I am starting on another project inspired by my trip to Atlanta: using Grunt to automate my plugin building experience. In this article I will go through my setup and initial project automation experience as I learn what Grunt can do for me, how to get it set up, and how I can use it to automate at least one of the many steps involved in the final production phase of a WordPress plugin.

My Environment

My WordPress plugin development environment is fully contained within a virtual Linux workstation running on a laptop.   My current setup:

  • CentOS 6.5 with a full GUI desktop
  • Apache 2.x
  • PHP 5.4.23
  • MySQL 5.5.35
  • NetBeans 8.0 RC1
  • SmartGit 3.0.11 (on top of git 1.7.1)
  • WordPress 3.8.1 (soon to be updated to the latest nightly build for 3.9)
  • Firefox
  • Selenium IDE

My Process

The basic outline of a premium add-on pack production cycle follows a basic routine:

  • Code Cycle
    • Edit code in NetBeans writing directly to the local wp-content/plugins directory.
    • Commit changes via SmartGit to the current development branch.
    • Push changes with SmartGit to BitBucket when ready.
  • Test Cycle
    • sudo mysql < wpreset.sql to reset the WordPress database (blasts all tables)
    • start Firefox and open Selenium IDE
    • run the New WP Install Selenium IDE script
    • run some of the base Store Locator Plus data test scripts
    • for add-on packs run the add-on pack specific test scripts
    • Edit/Commit/Push/Repeat
  • Production Cycle
    • Validate the readme.txt file.
    • Publish a blog post about the update and new features.
    • Edit the CSA Server Update System with new version/date information.
    • Edit the CSA Version Information page.
    • Update the News and Info “signage”.
    • Package the plugin .zip file.
    • Publish the .zip file to the CSA servers.
    • If it is a WordPress Plugin Directory listing, update the svn repo to publish to WordPress.

As you can imagine, there are a lot of steps in the final production cycle that can be automated.    I will also be exploring phpUnit testing for my plugins to provide deeper testing of the plugins that can be automated to be a virtually hands-off test system, but that is a project for later.  For now, I need to learn Grunt and start replacing my useful but less-flexible bash scripts to simplify the final Production Cycle.

Installing Grunt

One of the first things I learned about Grunt is that it runs on node.js, and I need the Node Package Manager (npm) to get it working. On CentOS this is fairly easy. I open a terminal and install npm with sudo; it brings the rest of the Node.js stack with it as dependencies. When that is complete you can install grunt-cli and grunt-init via npm. Apparently I am going to want something called “grunt init templates” as part of the Grunt scaffolding setup, so I will also use git to clone one of the official “bare bones” templates into my Linux user home directory.

NPM is part of the Extra Packages for Enterprise Linux (EPEL) repository. You will need to install this repository before the yum install npm command will work. If you are using a stock CentOS 6.5 release you can go to the following URL, click on the package link, and open the download with the package installer; the EPEL yum repository will be installed and activated:

http://mirrors.severcentral.net/fedora/epel/6/i386/repoview/epel-release.html

$ sudo yum install npm
$ sudo npm install -g grunt-cli
$ sudo npm install -g grunt-init
$ git clone https://github.com/gruntjs/grunt-init-gruntfile.git ~/.grunt-init/gruntfile

With my initial install on CentOS 6.5 there were multiple “unmet dependency” errors followed by a “but will load” message. Running the grunt command still brings up the Grunt CLI, so for now I am going to assume those errors are not important for getting my first simplified Grunt tasks running.

Grunt CLI Install Warnings
Grunt CLI install warnings.

Adding Grunt To My Plugin Project

For years I’ve been using a series of Bash scripts to help manage my distribution. One of the scripts that I use in every production cycle is named makezip.sh. This script packages up the plugin subdirectory I specify, skipping various files and directories (.git, the assets subdirectory, and a few others), and creates a .zip file in a directory “far away from” the running WordPress install. I can opt to send copies to the live server when they are ready for publication or keep them local for manual distribution and/or testing. I bring this up because it impacts my first Grunt setup on my Enhanced Results premium add-on for Store Locator Plus.

I already use the assets sub-directory within each wp-content/plugins/<plugin> directory to store all of my development and production scripts and related assets. That gives me a place where I can store my Grunt configuration and script files without impacting the plugin distribution.

[box type=”note” style=”rounded”]My Dev Environment: The ./assets directory under the current plugin directory is NOT distributed during production.[/box]

To get started I go to my plugin sub-directory on my development system and create a grunt folder and the starting assets via the grunt-init template loaded with git clone as noted above. After getting the template installed I run npm init to set the package.json defaults and npm install to fetch the “helpers” for Grunt. The helpers are node modules, aka Grunt plugins, that will be referenced by the Gruntfile.js execution.

cd ./wp-content/plugins/slp-enhanced-results
mkdir assets
cd assets
echo '<?php // Silence is golden.' > index.php
mkdir grunt
cd grunt
echo '<?php // Silence is golden.' > index.php
# get a basic package.json and Gruntfile.js in place
grunt-init gruntfile
# set some defaults in package.json
npm init
# installs the modules specified in the package.json
npm install

[box type=”note” style=”rounded”]What are those echo commands? They create an index.php file to prevent browsing of the directories from a web interface. They are there as an extra safety measure in case the assets directory gets published.[/box]

With the gruntfile template I tell it that I am not using the DOM but will want to concatenate and minify files and that I will be using a package.json file at some point.

Grunt Gruntfile Template Setup
Grunt gruntfile template setup for my add-on pack.

Running this command puts the Gruntfile.js and package.json files in my ./assets/grunt folder and gives me access to the basic scripting tools necessary to start a grunt project.

I think I’m ready for some automation!

Gruntfile Defaults

Earlier I ran the grunt-init gruntfile step to set up a default starter environment for Grunt. Time to dig into the details of the two installed files. Some basic reading tells me that the package.json file tells Grunt which “helpers” are to be available to this project by default, including their version numbers:

Grunt “Helpers”

Helpers are officially termed “plugins” in the Grunt world.   I call them helpers at this stage to remind me that they help perform tasks within my project but I’ll still need to guide them as to what to do.

The default “helpers” in package.json:

{
  "engines": {
    "node": ">= 0.10.0"
  },
  "devDependencies": {
    "grunt": "~0.4.2",
    "grunt-contrib-jshint": "~0.7.2",
    "grunt-contrib-watch": "~0.5.3",
    "grunt-contrib-nodeunit": "~0.2.2",
    "grunt-contrib-concat": "~0.3.0",
    "grunt-contrib-uglify": "~0.2.7"
  }
}

What are these helpers? The first couple of entries are obvious. The base JavaScript engine is node and the first “helper” is grunt. What are the rest?   They are all plugins from the grunt-contrib library which gives us some hints:

  • grunt-contrib-jshint

    Validate files with JSHint.   JSHint looks for issues in the syntax of JavaScript files.  With Grunt this happens BEFORE they are published if you keep this as a default task.

  • grunt-contrib-watch

    Run tasks whenever watched files change.  This watches files in your projects.  If a file changes, do something.

  • grunt-contrib-nodeunit

    Run Nodeunit unit tests. Allows node unit tests to be added to your project and run during a build cycle with Grunt.

  • grunt-contrib-concat

    Concatenate files.  Grab a list of files and concatenate them.

  • grunt-contrib-uglify

    Minify files with UglifyJS.  Create minimized JavaScript files from your source files.  Speeds up page load times by cutting out all of the non-executable parts of a JavaScript file including white space.

Grunt Commands and Execution

The other file created by the default template I’ve used is the Gruntfile, stored as Gruntfile.js. That is a pretty good hint that it is a JavaScript file. Here is what it looks like:

/*global module:false*/
module.exports = function(grunt) {

  // Project configuration.
  grunt.initConfig({
    // Metadata.
    pkg: grunt.file.readJSON('package.json'),
    banner: '/*! <%= pkg.title || pkg.name %> - v<%= pkg.version %> - ' +
      '<%= grunt.template.today("yyyy-mm-dd") %>\n' +
      '<%= pkg.homepage ? "* " + pkg.homepage + "\\n" : "" %>' +
      '* Copyright (c) <%= grunt.template.today("yyyy") %> <%= pkg.author.name %>;' +
      ' Licensed <%= _.pluck(pkg.licenses, "type").join(", ") %> */\n',
    // Task configuration.
    concat: {
      options: {
        banner: '<%= banner %>',
        stripBanners: true
      },
      dist: {
        src: ['lib/<%= pkg.name %>.js'],
        dest: 'dist/<%= pkg.name %>.js'
      }
    },
    uglify: {
      options: {
        banner: '<%= banner %>'
      },
      dist: {
        src: '<%= concat.dist.dest %>',
        dest: 'dist/<%= pkg.name %>.min.js'
      }
    },
    jshint: {
      options: {
        curly: true,
        eqeqeq: true,
        immed: true,
        latedef: true,
        newcap: true,
        noarg: true,
        sub: true,
        undef: true,
        unused: true,
        boss: true,
        eqnull: true,
        globals: {}
      },
      gruntfile: {
        src: 'Gruntfile.js'
      },
      lib_test: {
        src: ['lib/**/*.js', 'test/**/*.js']
      }
    },
    nodeunit: {
      files: ['test/**/*_test.js']
    },
    watch: {
      gruntfile: {
        files: '<%= jshint.gruntfile.src %>',
        tasks: ['jshint:gruntfile']
      },
      lib_test: {
        files: '<%= jshint.lib_test.src %>',
        tasks: ['jshint:lib_test', 'nodeunit']
      }
    }
  });

  // These plugins provide necessary tasks.
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-nodeunit');
  grunt.loadNpmTasks('grunt-contrib-jshint');
  grunt.loadNpmTasks('grunt-contrib-watch');

  // Default task.
  grunt.registerTask('default', ['jshint', 'nodeunit', 'concat', 'uglify']);

};

After sitting in on the Grunt session at WordCamp I know a couple of things about this file. The initConfig section sets the rules for the various “helpers” and way down near the bottom is a Default task. This is what does all of the work when I run Grunt in my project.

What is this going to do?

The default I have now is going to run jshint, which will look for any JavaScript files and scan them for syntax issues and other problems like unused variables (I can tell that by looking at the jshint section higher up in the code). It will then run any nodeunit tests by looking in test/**/ for any files ending in _test.js and executing them (I assume). It will concat any files that live in lib/ that end with .js and store the result in dist/<pkg.name>.js. Finally it will uglify… minify… that concatenated output in the destination directory defined by the concat section (dist/).

Tweaking Gruntfile For Me

Looks like some decent defaults, but for my project I need to change some things.

I won’t run Node unit testing on this project. In the Gruntfile I remove nodeunit from the default tasks list and from the config section above it. I also remove the grunt-contrib-nodeunit package from package.json.

As noted above, the Grunt project directory on my setup is under the assets subdirectory for this plugin.   All of my main project files are up a couple of levels in the plugin parent directory.   Since I want to distribute both the original JS files AND the minified version, I need to change some paths.   I am going to concat and uglify the scripts in the main plugin distribution directories and output the minified versions there.   I need to change any paths in the upper “config part” of the Gruntfile.js:

Update the concat section:

    // Task configuration.
    concat: {
      options: {
        banner: '<%= banner %>',
        stripBanners: true
      },
      dist: {
        src: ['../../js/<%= pkg.name %>.js'],
        dest: '../../js/<%= pkg.name %>.concat.js'
      }
    },

And the JSHint section:

    jshint: {
      options: {
        curly: true,
        eqeqeq: true,
        immed: true,
        latedef: true,
        newcap: true,
        noarg: true,
        sub: true,
        undef: true,
        unused: true,
        boss: true,
        eqnull: true,
        globals: {}
      },
      gruntfile: {
        src: 'Gruntfile.js'
      },
      lib_test: {
        src: ['../../js/*.js', '../../js/test/**/*.js']
      }
    },

That will ensure that all JavaScript output goes in my standard ./js subdirectory in my plugin. Yes, users will get those files, making for a larger zip file download and more disk space on their server. Disk space is cheap and download speeds are decent in most places. Not to mention zip is pretty darn good at compressing JavaScript files. When my plugin executes it will load the concatenated minified file, which gives the full benefits of execution speed and a reduced RAM footprint on the server and in the user’s browser. This keeps the original source available to my user base so they can read the code and hack functionality if they find the need, without wading through obfuscated, minified, concatenated JavaScript.

I’ve also learned that I need to add more details to the package.json file in order to get the default rules and tools to work.  This includes adding the name, version, and author variables to package.json.   If you do not define all 3, INCLUDING AUTHOR, you will get the following error:

Running "concat:dist" (concat) task
Warning: An error occurred while processing a template (Cannot read property 'name' of undefined). Use --force to continue.
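
That warning is just JavaScript property access on undefined surfacing through the banner template: with no author key in package.json, pkg.author is undefined, so <%= pkg.author.name %> throws. A minimal illustration:

```javascript
// Why the banner template fails: with no "author" in package.json,
// pkg.author is undefined and reading .name from undefined throws.
const pkg = { name: 'slp-enhanced-results', version: '4.1.01' };

let failed = false;
try {
  // This is what the banner's pkg.author.name lookup boils down to:
  console.log(pkg.author.name);
} catch (e) {
  failed = e instanceof TypeError;
}
console.log(failed);  // true
```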

Adding the name, version, and author elements to the default package.json results in:

{
  "name": "slp-enhanced-results",
  "version": "4.1.01",
  "author": "csa",
  "engines": {
    "node": ">= 0.10.0"
  },
  "devDependencies": {
    "grunt": "~0.4.2",
    "grunt-contrib-jshint": "~0.7.2",
    "grunt-contrib-watch": "~0.5.3",
    "grunt-contrib-concat": "~0.3.0",
    "grunt-contrib-uglify": "~0.2.7"
  }
}

Now I can run the grunt command, which will execute jshint, concat, and uglify on all my JavaScript files. Currently the Grunt configuration outputs some headers so I know it is thinking about doing something, but I don’t have anything interesting to process yet. I will soon; that will be content for the next article about automating my WordPress workflow.

First Grunt Run - No Errors
First Grunt Run without errors.
Posted on

WordPress Dev Kit : Legacy Publisher

WordPress Plugin Directory Banner

This article is about my legacy publishing process for development WordPress plugins and themes.   It is my workflow and it is far from perfect.  It will evolve over time and I am in the midst of making several big changes using new tools to assist in the process.   To get some background on the process you may want to follow along with the WordPress Workflow and WordPress Development Kit threads on my blog.

My Setup

My WordPress development is fully contained in a VirtualBox running CentOS 6.5 with a full GUI interface. I am working on building and distributing some Vagrant boxes that will help others replicate the process. My base CentOS 6.5 GUI box is online but is not a full setup… yet. You can read about Vagrant in the WordPress Workflow thread.

My plugin development environment consists of a standard WordPress production build (3.8.1 today) with the typical wp-content/plugins subdirectory.  On my development system I work directly inside of these directories.  It provides instant access to the modified code without running a repetitive series of push/pull commands to move stuff into a “ready to review” state.    I find this configuration to be far more efficient and rely on semi-intelligent scripts to help me “weed out” the development environment when building the production-ready kits.    As a last step before production I have a test box on a remote server where I install the production releases as a clean last-minute sanity check.  This will migrate to a Vagrant-based “clean production simulator” to provide more robust testing.

How does my “inline production” setup look?

The Build Scripts

The top-level build scripts go in my ./wp-content/plugins directory. That is not exactly accurate: symbolically linked file pointers (symlinks for Linux geeks, or shortcuts in Windows lingo) go in that directory. This includes the makezip, update_wprepo, exclude, and common files that are part of the legacy/publisher section of the new WP Dev Kit I’ve published on Bitbucket.

Plugin Assets

Within each plugin subdirectory I create an assets subdirectory. This is where my “build and production helpers” go. My scripts specifically avoid this directory when packaging the “for consumption” products. Within this directory I currently store things like raw image assets that typically get pushed to the WordPress Plugin Directory, a full copy of the WordPress svn repository for any plugins published there, and other “goodies” I need for production that I do not distribute. This is where my new Grunt and other build helpers will go.

Published Files Store

There is a separate directory under my home directory, ~/myplugins, where the published files go.   This helps keep my production files separate from the main plugins.  This will eventually change as I improve and streamline the process.

NetBeans “Push” Project

I also have a special directory under my home directory where I keep non-WordPress NetBeans projects.   One of those projects is a private project named “csa_licman”.  One of the many things this project does, which will also be changing, is sync any changes I make in the subdirectories of the csa_licman project up to my live server.  I use this to “publish” my premium add-ons to my live server.    I will not cover that part of the project here or the NetBeans automated FTP push setup.

You can find details on getting NetBeans projects to watch a directory and FTP files elsewhere on the NetBeans site.  I will be changing this complex part of the system to automated Grunt processing in a later revision to the WP Dev Kit.    For now you can keep the myplugins directory and manually FTP files from there to any private directories.

Keep in mind that anything going to the public WordPress Plugin Directory will be handled via the legacy update_wprepo.sh script.   The NetBeans push project is only for my premium add-on packs that are not hosted in the WP Plugin Directory.

My tool kit currently consists of:

git – for managing my code repository.  I find it faster and lighter than svn.

svn – solely for publishing to the WP Plugin Directory.

smartgit – my GUI to help me cheat with git commands, I am MUCH faster with the GUI.

NetBeans – my IDE setup specifically for PHP development with heavy use of phpDoc hints in my code.

Firefox with Firebug – my test and development browser because Firebug is the easiest-to-use development tool set.

command line with Bash – the Linux command line for executing my Bash scripts, most of this should work on OSX as well.

Using The Legacy Publisher

Here is how I set up and use the legacy publisher scripts. I call them “legacy publisher” scripts because I am learning Grunt and working toward a more intelligent processor. The legacy publisher is based on simple Bash scripts, which can be complex to configure and extend. Grunt provides a managed system that will fetch various plugins to do a lot of cool stuff that would take me far too long to replicate in Bash. Why re-invent the wheel?

After I get my new dev box set up, with WordPress up-and-running and my development plugins in place under the wp-content/plugins directory, I attach my legacy publisher kit. I am not going to cover the NetBeans push setup under csa_licman, though you will see it referenced in the shell scripts. Always answer “n” to “publish to live server” when running any of the legacy scripts and you will not have issues, as that part of the script will not be executed.

Under my login I use git to fetch the WP Dev Kit from Bitbucket and set up my published files directory, then link the legacy publisher kit into my WordPress plugins directory:

cd ~
mkdir myplugins
git clone git@bitbucket.org:lance_cleveland/wp-dev-kit.git ./wp-dev-kit
cd /var/www/wpslp/wp-content/plugins
ln -s ~/wp-dev-kit/legacy/publisher/* .

I am now ready to use the legacy kit to build my production zip files and publish them by hand to my server or use update_wprepo.sh to push them to my WordPress Plugin Directory listing.

Simple Self-Hosted Plugins

My premium add-on plugins are the simplest setup for this environment.   Once I have my legacy publisher scripts in place I can create my base plugin files right in the plugin directory.   I edit my readme and php files, test locally, and when ready I publish them in one step with the makezip command.  Since I have a NetBeans automated push project I can answer “y” to the publish to live server question and get the files over to my server.  If you do not have this setup you can answer no and manually FTP the files in ~/myplugins to your final destination.

cd /var/www/wpslp/wp-content/plugins
./makezip slp-enhanced-results
Legacy Publisher Makezip Example
Using legacy publisher scripts to create a production zip file.

WordPress Hosted Plugins

WordPress-hosted plugins, such as the Store Locator Plus plugin, require a bit of extra work to get them ready for production. After the initial setup is in place I can update the production release using the update_wprepo.sh script. In this setup I use the assets subdirectory to assist in production; it is where the svn repository lives that pushes updates to the WordPress servers.

Store Locator Plus Plugin Development Folders
Store Locator Plus plugin development folders.

I start with my plugin directory in place and build the setup necessary for the legacy script to publish to WordPress:

cd /var/www/wpslp/wp-content/plugins
cd store-locator-le
mkdir assets
echo '<?php // Silence is golden.' > index.php
svn co 'http://plugins.svn.wordpress.org/store-locator-le/' svnrepo
mkdir public

My plugin is now ready for publication. I go to the wp-content/plugins directory, run update_wprepo.sh, and answer the prompts whenever I am ready to push to the WordPress plugin directory. The process also builds a local zip file in the ~/myplugins directory so I can keep a copy on my servers. The update_wprepo.sh script will update trunk, create the version branch, and even try to clean up any tags or branches if a push of the same version failed previously.

WP Directory Marketing Assets

Getting asset files into the WordPress repository is also part of the script. This is the preferred method for storing your plugin banner and screen shots, as it lists them on the WordPress site without filling up the .zip distribution that goes to the client. With the legacy publisher it is easy, though a bit of a manual process. I store my banner and screenshot files in the ./assets subdirectory (note to self: should create a wp_marketing_images subdir). Since the assets directory is never part of the published file set, they are ignored. To add assets to the WP Plugin Directory publication I create an assets directory under the svnrepo, copy over the image files I want, and run svn up from the svnrepo directory.

The Scripts

The following scripts are included in the legacy/publisher directory in my WordPress Dev Kit:

common.sh

Setup common Bash variables that the other scripts use to run.

exclude.lst

The list of directories and files to be excluded when zipping up the plugin directory.

makezip.sh

The script that packages up the plugin directory. It currently carries some extra “cruft” that renames some of my plugin directories to a different zip file name when doing the “publish to live server” logic.

update_wprepo.sh

Runs makezip, unpacks the zip file into the assets/public directory to ensure a clean copy of the plugin, then copies it to the svn repository trunk and branches to push to the WP Plugin Directory.