Creating Your Very First iMacro – Web Browser Automation

Following on from my earlier article on How to Install Firefox & The iMacros Addon, which also included some performance tips, this video shows you how to create your very first iMacro, and also a more complicated macro that performs a useful action.

In the video I add an extra command called “WAIT”. The WAIT command allows iMacros to pause for a period of time while an action is performed, and as with all iMacros commands, it needs to be in UPPERCASE.
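As a minimal sketch of a macro using WAIT (the URL and link text here are placeholders of my own, not taken from the video):

```
TAB T=1
URL GOTO=http://example.com/
' Give the page 3 seconds to finish rendering before clicking
WAIT SECONDS=3
TAG POS=1 TYPE=A ATTR=TXT:Login
```

Lines starting with an apostrophe are comments; increasing the SECONDS value gives slower pages more time to load before the next command runs.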

Installing Firefox & The iMacros Addon + Speed Tips

The video below takes you through downloading Firefox and the iMacros extension, plus some speed tips to improve the performance of your macros in Firefox.

 

Installing Firefox & The iMacros Addon Video

 

 

Dealing With “Odd” Data Delimiters In Microsoft Excel

In this video I cover the following topics:

  1. What CSV files are
  2. What the data inside a CSV file looks like
  3. Why the standard comma separator sometimes doesn't work
  4. An example of a different separator
  5. How to open such files in Microsoft Excel (at 4:00 minutes in)
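To illustrate the same idea programmatically, here is a minimal Python sketch (the data is invented for this example) that reads a snippet of semicolon-separated data:

```python
import csv
import io

# Hypothetical CSV content that uses a semicolon separator instead of a comma.
raw = "sku;title;price\nTV-100;42in LED TV;299.99\nTV-200;50in LED TV;399.99\n"

# Passing delimiter=";" is the programmatic equivalent of choosing the
# separator in Excel's text import wizard.
rows = list(csv.reader(io.StringIO(raw), delimiter=";"))
print(rows[0])  # the header row
print(rows[1])  # the first data row
```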

Website Data Extraction/Scraping & Form Filling Expert

Due to demand, this service is now only available to existing clients; it is no longer available to new clients.

Website Data Scraping

Web scraping can be hugely advantageous to businesses, allowing them to function more effectively and keep up to date with information on specific websites more frequently and accurately.

This is especially true when you consider that the applications created can be run by numerous members of staff on an ad-hoc basis, or even automated to run every day at set times. They can also provide access to complex data from suppliers for more effective merchandising, or keep internal systems updated more frequently with stock and pricing information.

It can also be a very quick process: most projects take only a few hours to complete, and small projects then take just seconds to run, depending on their complexity and the speed of the user's internet connection.

I have several years' experience with web scraping across many projects and requirements. In this article I cover the details of extraction in more depth and include examples where suitable. If you have a project in mind, contact me today with your requirements.

I specialise in small & medium scale scraping projects, such as extracting data from supplier websites for product information, stock & price updates.

However, I can readily tackle multi-tiered extractions and also create clean data from complex situations to import into 3rd party applications with little to no input from the user.

Not only can most data be extracted from most websites, data can also be posted to websites from data files such as CSV. This could be form filling for job applications, listing products onto websites or online dating requests, not just extracting product, service or article data from a website.

If you can do it in a web browser, then it can most likely be automated.

The possibilities are almost endless.

If you have a project in mind, Contact Matthew today; it could be completed in just a few hours.

 

Getting The Edge With Data Extraction

Using automated tools to grab or post data to the web could trim hours off each day or week. Extracting the latest stock & prices from suppliers could mean higher profitability and fewer back-orders. It could even mean reams of data from suppliers' websites, giving your business the edge over your competitors.

It doesn't matter if it's behind password-protected content: if you can “see it” in your web browser, chances are it can be extracted. If you're entering data manually into website forms, chances are high that it can be automated too.

I've worked on numerous projects where clients have been able to ensure that their back-office tools are as up to date as possible with the latest information from suppliers. It has even allowed businesses to work with suppliers they'd never been able to work with before, because the requirements to extract data from those suppliers' websites had been too restrictive, either in time or cost.

Knowing what your competitors' prices are can be a huge advantage when it comes to pricing, especially in the eCommerce environment we have today. If you've got the data and it can be matched to other sites, then within one click and a few minutes, the latest pricing information from competitors could be yours. As many times as you want, whenever you want.

Scraping & data extraction can solve this in a cost-effective manner. One script, used over and over. Any time you want, by however many members of staff you have.

If you want the edge, Contact Matthew today.

 

The Required Tools Are Free

Using two free applications, the Firefox web browser and a free add-on called iMacros, anything from simple to very complex web automation can be completed.

This allows completed projects to be run by the owner using free-to-use tools, so that any extraction or processing can be run by the owner or staff members as many times, and as often, as they require.

Extra processing can also be applied using JavaScript to handle complex data inputs or data extracted from websites. I cover this in more detail in the “extra data processing” section.

Don't worry if you've never used either of these before: if you've used a web browser and can press a button, it's that simple. I'll help you get started and it's very easy to do. I also include instructional videos to get you set up. It'll take no more than 10 minutes.

Simple Extraction

In this scenario, data elements from a single page can be extracted and then saved to a CSV file.

Example:

This could be a product detail page for a TV, where the required elements, such as:

  • Product title
  • Price(s)
  • Stock number
  • Model number
  • Images
  • Product specifics
  • Descriptions
  • Reviews

are all extracted and then saved to a CSV file for your own use.
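As a sketch of how those extracted elements end up in a file (the product values below are invented, purely for illustration), each product becomes one CSV row under a header:

```python
import csv
import io

# Hypothetical fields scraped from a single product detail page.
product = {
    "title": "42in LED TV",
    "price": "299.99",
    "stock_number": "IN-STOCK-14",
    "model_number": "LED-42X",
}

# Write a header row followed by the product's values, just as a
# simple single-page extraction would save them.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(product))
writer.writeheader()
writer.writerow(product)
csv_text = buf.getvalue()
print(csv_text)
```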

The time it takes to make a simple extraction of data from a single page varies greatly: the data on the page can sometimes be very poorly formatted, or there may be lots of fields that need to be extracted, which can take quite some time.

If you have a project in mind, Contact Matthew today; it could be completed in just a few hours.

Extra Data Processing

Extra processing can be applied to the extracted data before saving to a CSV file. This is very handy when you only want or require cleaned data to be saved. Most of the time it's obvious that cleaning is needed, and basic cleaning of the data is included in the macro.

The quickest way of identifying any processing you require on extracted data is to provide an example file showing how you would like the final data to look.

Example:

If one of the extracted fields was a complex data field, such as an email address held with other data in JavaScript like this:

<script language="javascript" type="text/javascript">var contactInfoFirstName = "Vivian"; var contactInfoLastName = "Smith"; var contactInfoCompanyName = " REALTY LLC"; var contactInfoEmail = "[email protected]"; </script>

Instead of including the extra information in the export, the email address can be identified and only that data field extracted. Or, if all the data held in the JavaScript is required, it could be split into separate columns, such as:

First Name, Last Name, Company Name, Email Address
Vivian, Smith, REALTY LLC, [email protected]
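As a sketch of this kind of clean-up, here is a minimal Python approach using a regular expression (my assumption for illustration, not necessarily how the actual macro does it) against the script block shown above:

```python
import re

# The JavaScript block from the example above, as found in the page source.
html = ('<script language="javascript" type="text/javascript">'
        'var contactInfoFirstName = "Vivian"; var contactInfoLastName = "Smith"; '
        'var contactInfoCompanyName = " REALTY LLC"; '
        'var contactInfoEmail = "[email protected]"; </script>')

# Capture each contactInfo* variable name and its quoted value,
# trimming stray whitespace such as the space before "REALTY LLC".
fields = {
    name: value.strip()
    for name, value in re.findall(r'var contactInfo(\w+) = "([^"]*)";', html)
}
print(fields)
```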

Also, if the data needs to be formatted for import into a 3rd party application, such as ChannelAdvisor, eSellerPro, Linnworks or a website application, this isn't a problem either. I'm exceptionally competent with Microsoft Excel & VBA and can help you leverage the gathered data, formatting it into a complete solution that requires the least amount of input from you or your staff.

Whether you have basic requirements or highly complex ones, Contact Matthew today; your data extraction project could be completed in just a few hours and fully customised to your business requirements.

Paginated Extraction

This can vary from site to site; however, a complex extraction could involve navigating several product pages on a website, such as search results, then navigating to each product in the results and performing a simple or complex extraction on the product's detail page.

Example (property)  – Website: Homepath.com

In this example, not only is the requirement to extract the data found for a specific property; it is also required that ALL the search results be extracted.

This would involve extracting all the results and then navigating to each property page and extracting the data on the property detail pages.

The time taken to extract the data from such pages depends on both the number of property results to go through and the amount of data to be extracted from each property details page.

Example (products) Website: Microdirect.co.uk

In this example, similar to the properties, the requirement is to extract the data from each of the product pages, but also to extract the product detail page data for all the pages in the search results.

The macro would navigate through each of the page results (say 10), identify each of the products, then one-by-one work its way through the products, saving the data to a file.

Need data from pages & pages of a website? Not a problem. Contact Matthew today; it could be completed in just a few hours.

Ultra Complex Extraction

These normally involve data being processed from a CSV file, then external processing & scraping by the macro, and then, depending on the results, possibly further processing or scraping. Such projects are normally very complex and can take some time to complete.

Working with multiple tiered drop-down boxes (options) falls into this category, as by their very nature they can be complex to deal with. It's also worth noting that it is possible to work with multiple tiers of options; for example, when making one selection, the results cause sub-options to appear. Sites that need image recognition technologies also fall into this category.

However, it's easier to explain with an example rather than go into minute detail.

Example

For this example, you have a CSV file with a number of terms that need to be searched for on a dating website. Once these searches are made, the details are saved, and then it is required to contact/email each of the persons through another form.

The macro will make intelligent searches for these terms, and the matching results (these are likely to be paginated) are saved to a separate file. Then, for each result that was saved, the macro will send a customised contact message through another form found on the same or a different website.

Do you feel your requirements are complicated, or that the website you'd like to extract from or post to isn't simple? Contact Matthew today; I'll be able to let you know exact timescales and can create the project for you at a fixed cost.

Saving Data & File Types

Extracted data is normally saved as CSV files. The data is separated by commas “,” and will open in Microsoft Excel or Open Office easily. For most applications using a comma will work perfectly.

However, sometimes the extracted data is complex (such as raw HTML) and using a comma as the separator causes issues with leakage when viewing in Microsoft Excel or Open Office. This is when using other characters, such as the pipe “|”, comes in very handy to separate the data fields (e.g. title and image).

The separator can be any single character or combination of characters you wish; some common examples are:

  • Comma “,”
  • Tab ”      “
  • Pipe “|”
  • Double pipe “||”
  • Semi-colon “;”
  • Double semi-colon “;;”

It will be quite clear from the outset which separator is required, either from the data being extracted or from the project's requirements. If you have any special requirements, please discuss this beforehand.
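To illustrate the leakage problem (with invented data), here is a short Python sketch comparing a naive comma-joined record against the same record using a pipe separator:

```python
import csv
import io

# A record whose description field itself contains a comma and raw HTML.
fields = ["42in LED TV", "<b>Slim, wall-mountable</b> panel"]

# Joining naively on commas (no quoting) makes the description "leak"
# into an extra column when the file is read back.
leaked = next(csv.reader(io.StringIO(",".join(fields))))
print(len(leaked))  # 3 columns instead of 2

# A pipe separator keeps both fields intact without any quoting tricks.
clean = next(csv.reader(io.StringIO("|".join(fields)), delimiter="|"))
print(len(clean))   # 2 columns
```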

XML or SQL insert statements can also be created if desired; however, this can add several hours onto projects due to the added complexity.

File types an issue? I can pre-process data files beforehand in other applications if needed. Contact Matthew today; it could be completed in just a few hours.

Speed of Extraction/Form Filling

As a general rule, the projects I create run exceptionally fast, however there are two factors that will limit the speed of them:

  • The speed of the website being interacted with
  • The speed of your connection to the internet

You can also make project scripts run much faster by ensuring that the following options in your iMacros settings are set exactly the same as those shown below.

You can find the options page shown below by clicking the “Edit” tab on the iMacros sidebar, then pressing the button called “Options”.

iMacros Option Panel

Even if the above looks complicated, it's not. Instructional videos are included and I'll make it exceptionally easy for you. Contact Matthew today; it could be completed in just a few hours.

Exceptions & Un-Scrape-able Pages

It is important that your processing requirements are discussed beforehand with examples, so that I can confirm whether or not automated scraping will suit your requirements. In most cases it will, but sometimes it's just not possible.

In some cases, it is not possible to extract data from pages over & over due to:

  • A poor ‘make up’ of the page
  • Inconsistent page layouts
  • Page structures that vary enormously from one page to another
  • Use of flash or excessive use of AJAX
  • CAPTCHA boxes (like reCAPTCHA)

When this happens, the only consistent method of extracting data from such pages is by a human, and scraping is unlikely to be suitable for your requirements. This is rare, but it does occur. If I identify this (and I will, very quickly), I'll let you know ASAP.

I am unwilling to work with data of questionable content. The points below are just common sense really; I've added them for completeness.

  • Adult orientated material (porn is a no, services are a no, ‘products’ are ok)
  • Sites that are focused towards children
  • Identifiable records on people such as medical records (order related data is fine if they are yours).
  • Most government sites
  • Situations where I suspect the data will be grossly misused for fraudulent or illegal purposes.

Unsure what your requirements are, or just not sure if web scraping is the right way forward for your business? Contact Matthew now; I'll be able to help you and turn it into plain English for you.

What Are the Limits of Extraction/Processing?

Most normal limitations are caused by very large or very deep page requirements of a project. That doesn't mean they're not possible, just that they could take some time to code for, and also some time for you to run each time.

The projects that I create suit smaller scale situations, such as one off extractions or extractions that need to be run by the owner over and over, such as on a daily basis to collect the latest product & pricing information from a supplier.

The real limitations come into force when the requirements are for huge scale extraction, such as hundreds of thousands of records, or exceptionally complex and exceptionally deep extractions. This is when tools such as Python, C++, Perl or other languages that allow spidering of websites would be more suitable.

This is not a speciality of mine; however, due to my experience with scraping, I can assist you with the project management of such projects with 3rd party contractors. Contact Matthew now if this is what you need.

Anonymity & Use of Proxies

If you need to keep the activities of such scripts hidden to remain anonymous, then this can be achieved on small scale projects using free proxies, with no interaction from yourself.

In larger or more repetitive situations, I can either help you set up your browser to use free proxies (which can be unreliable at times) or, as I've found works best in most cases, leverage inexpensive services that are very easy to use and, most importantly, reliable.

If this is a concern for you, don't worry, I've done it all before. Contact Matthew now if this is a requirement for your project.

Do you provide ‘open’ code?

For ‘small’ or ‘simple’ macros, yes, the code is open and you or your development team are able to edit it as required.

However, for some complex or ultra complex macros the code is obfuscated due to the extra functions that are normally included. This is non-negotiable, as I have developed custom functions that allow me to uniquely deal with complex situations of data extraction & posting.

Is Web Scraping Legal?

The answer to this can be both yes and no depending upon the circumstances.

Legal Data Extraction
For example, if you own the site and the data being extracted, then you own the data and you're using it for your own purposes. If you gain permission beforehand, for example from a supplier, to extract data from their website, this is also legal.

I have worked on projects where an employee has left a company, there is no access to the back-end/administration consoles of the websites, and the only way of obtaining the data held on the site is by scraping. I've done BuddyPress, WordPress, PHP-Nuke, phpBB, e107 & vBulletin sites previously, to name just a few.

I have also completed many projects where product data is extracted for use by a business to obtain up-to-date pricing and stock information from suppliers' websites, along with extra product & categorisation data too.

Illegal Data Extraction
Because the macros are run on your or your staff's computers, scenarios outside of those where the sites are owned or permission has been granted fall to your discretion.

I cannot be held responsible for any legal action that may proceed from your running such scripts on 3rd party websites. As such, I strongly recommend that you contact the 3rd party to seek consent and check any privacy or usage policies they may have prior to extraction.

Contact Matthew

Whether you've got a clear idea of what you'd like done, or you're just not sure if it's even possible, Contact Matthew today and I'll be able to tell you if it is possible, how long it will take, how much it will cost and when the completed project will be with you.

Removing Old Email from Hotmail

If, like me, you've had a Hotmail account for years, there is a high chance that it's clogged up with hundreds, if not thousands, of emails that you'll never read. I was finally bugged by logging into Windows each time and being told I had in excess of 24,000 unread emails. Not funny.

I Googled a bit and the only helpful post said to contact their customer support and ask for the emails to be removed. Thinking this would probably take ages, and knowing I could probably do it in a few minutes with iMacros, I put this little macro to work.

Hotmail iMacro:

SET !TIMEOUT 1
TAB T=1
TAG POS=1 TYPE=INPUT:CHECKBOX FORM=ACTION:/mail/InboxLight.aspx?n=* ATTR=ID:msgChkAll CONTENT=YES
WAIT SECONDS=1
TAG POS=1 TYPE=SPAN ATTR=TXT:Delete
WAIT SECONDS=1

How to make this work

Over to the right you'll see I cleaned out almost 25,000 emails, and it's really easy to do.

But first we need three parts to be able to complete this:

  1. FireFox web browser
  2. iMacros plugin for FireFox
  3. This macro (saved as a zip file)

The web browser and iMacros plugin are pretty straightforward: download and install Firefox if you don't have it, then once installed, go to the iMacros plugin page, press the download button and follow the on-screen instructions.

Then download the macro and unzip it to your macros directory; by default this is in mydocuments > imacros > macros. Once added, open up Firefox and look for the iMacros button in the header. If it's not there, press “View” along the top, then “Sidebars”, and select “iOpus iMacros” and the sidebar will appear.

You should see a list of macros down the left column. If you do not see the Hotmail.iim one, at the bottom there are three tabs; press the edit tab and then press the “Refresh Macro List” button, as shown to the left.

This will update the current list with the new macro you've added; now we're almost ready to go!

Login to Hotmail and then go to your inbox. To begin with, we'll just run the macro once to ensure that it's working: select the macro from the list and, back on the play tab, press “Play”.

You'll see the macro hit the check box at the top of the emails and then delete them. That's great, but that is only the first page, and if, like me, you have hundreds of pages to go through, we need a little extra option, which I'll explain now.

With the macro selected, we can run this as many times as we wish to, in this case you can see I’ve set it to 270 times. Hit the “Play (Loop)” button and you’re off!

Warning!

This macro is not selective: it'll delete everything, which for me, frankly, was a great idea, as I had already copied the mail over to GMail a long time ago.

Some tips

If you find that your inbox is not updating fast enough and the macro is hitting the delete button too early, then you will want to lengthen the two lines that say “WAIT SECONDS=1” to maybe 2, or if your connection is very slow, 5 or more. Tweak until you find a suitable value.

I also found that the flash adverts would delay the page render times. There is another addon called ‘Toolbar Buttons’ which includes many buttons, one of which is a flash toggle, allowing you to temporarily disable flash. Once this addon is installed (and Firefox is restarted), right click the top menu, select ‘Customise’, look for the red F button and add it to the toolbar. Then it's just a case of pressing it to enable or disable flash. Another alternative (which may be easier) is to use this addon.

Conclusion

iMacros are ace, but the simplest ones are the best of all.

Do you have a bloated hotmail account too?

Amazon IS Human – They Make Mistakes Too!

Relish this one: it's the first time I have ever spotted a mistake by Amazon.

While checking to see where one of the books I ordered on the 5th had got to, I noticed in my inbox an email relating to the new Jeans Store they are promoting.

After taking a little look around, mainly to see if they were promoting anyone else other than their own stocked products, I spotted the first mistake I have ever seen on Amazon for one of their own products.

You can view the item here:
http://www.amazon.co.uk/gp/product/B0045OW8TS

Amazon Screen Shot

Amazon Makes Mistakes Too!

Spotted it yet? Here it is:

Size: #REF!

A common error in Excel when a reference is not found.

Matthew’s Top Tip

You can stop mistakes like this by using the ISERROR() function in Excel. Let's assume that we want to verify that the value in cell B2 is not #REF! or similar (as this function catches far more than just #REF!).

In A2 we would use this little combination function:

=IF(ISERROR(B2),"An Error is Found","No Error Found")

In English this is:

If B2 has an error in it, do "An Error is Found" otherwise do "No Error Found"
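Outside Excel, the same guard is easy to script. Here is a rough Python analogue (my own sketch, not an Excel feature) that checks a cell value against Excel's standard error strings:

```python
# Excel's standard worksheet error values, the kind ISERROR() catches.
EXCEL_ERRORS = {"#N/A", "#VALUE!", "#REF!", "#DIV/0!", "#NUM!", "#NAME?", "#NULL!"}

def check_cell(value):
    # Mirrors =IF(ISERROR(B2),"An Error is Found","No Error Found")
    return "An Error is Found" if value in EXCEL_ERRORS else "No Error Found"

print(check_cell("#REF!"))   # An Error is Found
print(check_cell("Medium"))  # No Error Found
```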

See, even Amazon can make mistakes when it comes to data. Although I must admit, from the literally hundreds of thousands of items I have seen on Amazon, and helped create for sellers on the platform, this is the first time I have seen Amazon make this mistake.

Thinking about it, it would be really easy for them to check the data being imported for common errors such as #N/A, #VALUE!, #REF!, #DIV/0!, #NUM!, #NAME? or #NULL! and reject it (although if you take this suggestion, please add a decent error code, as the Amazon error codes list is far from useful; saying ‘or another problem’ is not helpful at times *stamps feet*).

Dealing with WordPress Spam Comments – Two Viable Solutions

Ignoring Akismet in this conversation, which will capture almost all comment spam, you may feel that there is little you can do to stop the waves of WordPress spam comments from being left.

Yes, you could add a CAPTCHA to the comments box, and there are several WordPress plugins that do this, but being a geek I prefer more server-based options. Here are two of my favourites, both with the same effect.

.htaccess Redirects

This is the simpler of the two. I have used this for years for keeping banned players out of our community websites. In the example below, I'm redirecting to google.com, but it could readily be any site you want; http://yougotrickrolled.com/ is always a good one. I'll leave the destination to your own selection.

If your hosting provider allows .htaccess files (or you have enabled them in your Apache config; they're on by default), then this is a simple but effective way of redirecting spammers:

RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^188\.143\.232\.39$
RewriteRule .* http://www.google.com [R,L]

This turns on the rewrite engine, adds a RewriteCond for the IP address and then, using RewriteRule, sends them to your chosen destination. Most amusing.

httpd.conf Edits

This is favoured when working in a development environment to keep a site open only to specific IP addresses, but it easily works in reverse to keep out entire subnets. After an unfortunate experience with an Indian development company I needed to block four subnets, and this worked wonderfully well.


order deny,allow
deny from 125.111.67.240
deny from 122.169

This works by selectively denying either specific IP addresses (as in the first deny line) or entire subnets (as in the second).

If you're working in a development environment and, say, your IP was ‘125.111.67.240’, then you could deny everyone else and allow yourself through using:


order deny,allow
deny from All
allow from 125.111.67.240

Enjoy.

Part 2: Using the Split Keyword to Break up Your Data

This article is a continuation of a previous article called Part 1: Comma Separated Keyword/Tag Blocks in Your eBay Listings? If you've not read that article, then nip back and read through it, as we'll need the original steps to get to where we are now.

So we left off with me saying that we should not process further keywords unless we have qualified the custom field, so that we actually have something worth continuing with. We did this by using this keyword setup:

{{IFNOT/[[CustomFields:Variations:Other Colours]]// do something }}

So if the value of ‘Other Colours’ is not blank, we should ‘do something’. Well, let's look at the ‘something’. This is where it really gets good:

{{Split/Value To Split/Split Character(s)/ __SplitValue__ }}

Scary? Nah, let's pop some values in here and we'll talk it back in plain English, looking at what the ‘Split’ keyword is going to do for us:

{{Split/{{CustomFields:Variations:Other Colours}}/, / <b>__SplitValue__</b>}}

Keeping this simple, let's assume we have ‘Blue,Red’ in our custom field for ‘Other Colours’, and that you know that the <b> and </b> are HTML tags that make text bold. So here it is:

For each ‘Split’ of the value of ‘Other Colours’ we are going to output <b>__SplitValue__</b>

Easy eh? Let's now use this with the two values ‘Blue,Red’ in our custom field for ‘Other Colours’; it would make the following:

<b>Blue</b><b>Red</b>

Wow, are you getting the power of this keyword yet? I hope so. Let's keep going and beef this out into something more usable. As the complexity of the keyword is now going to grow rapidly, I'm going to be using syntax highlighting on the code so it's easier for you to read:

{{IFNOT/[[CustomFields:Variations:Other Colours]]//
	{{Split/{{CustomFields:Variations:Other Colours}}/, /
		Find more items in __SplitValue__
		,
	}}.
}}

So this would make the following:

Find more items in Blue, Find more items in Red.

A quick note on the URL I used: I simply went to eBay, picked the nearest store, entered ‘Blue’ in the search box on the left, but crucially ticked the box called ‘in titles & descriptions’, and chopped the _SID=NNNNNN off the end; if you're unsure, leave a comment on this post. I chose the ‘in titles & descriptions’ option as I very much doubt any of you are spamming the titles with all the colour variations. For the super smart ones out there, instead of searching for just ‘Blue’, you would prefix these style colours with something like ‘sBlue’, so that the colour matching using this technique is absolute in its results (not clouded by junk results on ‘blue’).

Next Steps

Taking this further, let's assume you have made some colour swatches as images that are 50×50 pixels (we could do some further IF statements to use HTML colour codes, but that's way out of scope for the purpose of this example) and also that you have entered your sizes into the ‘Sizes’ custom field we first discussed. This could make something like:

{{IFNOT/[[CustomFields:Variations:Other Colours]]//
	This item is available in other colours, pick your colour:
	{{Split/{{CustomFields:Variations:Other Colours}}/, / Find more items in the __SplitValue__ colour }}
}}
{{IFNOT/[[CustomFields:Variations:Sizes]]//
	This item is available in other sizes, pick your size:
	{{Split/{{CustomFields:Variations:Sizes}}/, / Find more items in the __SplitValue__ size }}
}}

Summary

That's quite a chunk of code to take in, but in simple terms, for each colour it'll bring in a swatch image of that colour and link it, then do a similar task for images that are named ‘size-3.png’ etc… Neat eh?

Now some might say, ‘well, eBay do variations now, I don't need sizes or colours in the listing…’. That's right, they do, but this example can easily be expanded for other values, like years of manufacture if you're selling roof racks, or perhaps this item is part of a range that is not being listed as multi level variations, just single variations. You're only limited here by your imagination and the application of your data.

The point is, with some thought and the right application of the tools & data at your disposal, you can actually have a targeted exit strategy for your eBay listings.

PS: For the XHTML junkies out there ‘border=”0″‘ is not valid, you’d want to use a CSS style or something :)

Part 1: Comma Separated Keyword/Tag Blocks in Your eBay Listings?

Clearing out my old files earlier, I came across some old keywords I used on numerous occasions. Instead of just detailing one, I'm actually going to join a few together here to make an example that anyone using eSellerPro could use with a little thought.

Lets Make a Real-Life Example

Let's assume that you have two custom fields, the first called ‘Other Colours’ and another called ‘Sizes’, both in the custom fields group called ‘Variations’. These are two very common fields for anyone who deals with variation products; they could of course be ‘Languages’ for, say, DVDs, or ‘Years’ for applicable models, and so on…

So let's get right in and cover the first keyword, which gets the value of the first field out so we can use it:

{{CustomFields:Variations:Other Colours}}

This keyword is in the format of ‘CustomFields:GroupName:FieldName’. CustomFields calls the Custom Fields; GroupName is vitally important because it was found that the keyword only pulls in the values of the custom fields from previous custom field calls, so if the value we were looking for was in a different group, then it would not be resolved; and finally FieldName, which is the internal name of the custom field (as opposed to the display name, which can be different).

So using the keyword ‘{{CustomFields:Variations:Other Colours}}’ we can pull out the values in ‘Other Colours’; for the sake of this example these other colours are ‘blue,green,red’. Notice they are separated by commas. This is extremely important, as we'll be using the comma to split them up shortly.

Wait!! Let's Error Check

Now before we go any further, we need to error check ourselves. What do I mean by this? I do not think it's a good idea to show or process any further code if there aren't any ‘other colours’ to show to the viewer; we do this using another keyword.

There are two variations of this new keyword, IF and IFNOT. Both of these allow us to check whether a condition is true (there are also IF/ELSE and IFNOT/ELSE versions, but they’re not required for this example; plus you could just alternate IF/IFNOT to capture the alternative if it’s a 1:1 check, anyway back on topic…). These keywords are in the format of:

{{IF/Value being Checked/Value to Check Against/Output if True}}
{{IFNOT/Value being Checked/Value to Check Against/Output if True}}

Looks scary, right? Naa, it’s easy, let’s do a real example. Let’s pretend we have the colour blue in our custom field ‘Other Colours’, and IF we find blue, let’s bring in an image that is the colour blue (for the smart ones, you can see where we could go with this):

{{IF/[[CustomFields:Variations:Other Colours]]/Blue/<img src="some-blue-image.png" />}}

Now you’ll notice the use of [[ & ]] and not {{ & }}; this is for a very good reason: imagine the custom field value contained ‘Blue/Red/Green’. This would break the earlier format of {{IF/Value being Checked/Value to Check Against/Output if True}}, so good practice is to always use the ‘square’ brackets and also do your best to avoid forward slashes ‘/’ in your data.

So if the value of ‘Other Colours’ was ‘Blue’ then we would have had a blue image appear, and if it wasn’t, ‘nothing’ would have been output from this statement.

That brings us very nicely onto ‘nothing’. Going back to the earlier point that it’s not a good idea to show or process code when there is no need, we need to check that ‘Other Colours’ actually has something in it before we continue. We do this by using this keyword set-up:

{{IFNOT/[[CustomFields:Variations:Other Colours]]// do something }}

This says in plain English: IF the value of Other Colours is NOT blank (that’s what the // is), then do something.
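Putting the blank check and the colour check together, here is a sketch of how the two keywords might be nested. This assumes the field holds a single value such as ‘Blue’ (splitting comma-separated lists is covered in part 2), and the image file names are just examples — it’s worth testing nesting like this in your own listings first:

```
{{IFNOT/[[CustomFields:Variations:Other Colours]]//
    {{IF/[[CustomFields:Variations:Other Colours]]/Blue/<img src="some-blue-image.png" />}}
    {{IF/[[CustomFields:Variations:Other Colours]]/Red/<img src="some-red-image.png" />}}
}}
```

The outer IFNOT means the inner checks (and their images) are only processed at all when ‘Other Colours’ actually contains something.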

End of Part 1

This feels like a good place to stop; I’ll cover the next stage of this little session in my next post. I hope I have at least got your creative juices flowing. I know what I’m thinking: perhaps we could have a set of colour images, each linked, so that customers could pick their alternative colours or sizes using a user-friendly block in your eBay listings — or maybe a ‘year picker’. Hey, the options are endless!

View part 2 here:  Using the Split Keyword to Break up Your Data

PHP: Search for String in a String Using strpos

This is more for my own reference than anything else. Working on a recent project where specific terms and stock numbers needed to be excluded, the function below quickly allowed me to action the excluded items and ensure their exclusion.

It’s different to the other options out there because of the use of ‘!==’. There is a catch when using strpos: if it finds the string at position 0, it returns 0, which a loose comparison treats as false, so the match is silently missed. Using ‘===’ or ‘!==’ checks for ‘identical’ or ‘not identical’ matches and avoids this.

Possibly the most boring post yet :)
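Before the full function, a minimal sketch of the position-0 catch itself (the strings here are made up for illustration):

```php
<?php
// "123" occurs at position 0 of "123 Widget"
var_dump(strpos("123 Widget", "123")); // int(0)

// Loose check: 0 is falsy, so the match is silently missed
if (strpos("123 Widget", "123")) {
    echo "found\n"; // never runs!
}

// Strict check: !== false distinguishes position 0 from "not found"
if (strpos("123 Widget", "123") !== false) {
    echo "found with !==\n"; // runs as expected
}
```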

// Declare the list of banned terms
$haystack = array(
	"123",
	"456",
	"789"
);

if (CheckIfExists($item->Title)) {
	// Do something if a banned term is found
}

// Returns true if any banned term appears anywhere in $needle
function CheckIfExists($needle)
{
	global $haystack;

	foreach ($haystack as $banned) {
		// !== false is vital: strpos() returns 0 for a match at
		// the start of the string, and 0 is loosely equal to false
		if (strpos($needle, $banned) !== false) {
			//echo "We have a banned Item!! - " . $needle;
			return true;
		}
	}
	return false;
}

Yoast WordPress Breadcrumbs Plugin

Taking the advice from yesterday’s post ‘Google SEO Starter Guide Updated’ on adding breadcrumbs to the site’s theme, I remembered seeing a plugin from Yoast.com a few days back.

Installation was done in seconds, almost like every other WordPress plugin. Unfortunately the auto-insert option did not work; however, with a few pastes into the theme files page.php, search.php and single.php it was in and working.

If you’re an avid blogger and the theme you are using does not come with breadcrumbs by default, this plugin is so easily added, even at code editor level, that it’s worth installing — and as a bonus it’s free.

Yoast.com also has a collection of other plugins for WordPress, you can see them here.