High Precision Job Scraping


Accurate and effortless job wrapping from career websites, with wrapping quality monitoring and support. Request a quote.


Replicate jobs to post to multiple locations and categories

SpiderMount can duplicate any vacancy and post multiple jobs, altering selected fields such as location or job type, so there is no need to copy jobs manually and change fields by hand. The spider generates multiple jobs by adding extra locations or other data to the original job content.


Job boards and aggregators that require the same job to be posted separately for each location in order to appear in candidate searches benefit from the replication tool. The job scraping software replicates the original job content, adds the relevant location, and posts the result to the job aggregator. This feature is typically used by job boards and aggregators that do not support multiple searchable locations for a single vacancy.


For example, an employer has a developer opening in London, plus a list of additional titles and locations where the opening is also available. SpiderMount saves the employer time by duplicating the vacancy and altering the title and location fields, then posting the resulting set of openings to the job distribution platform.
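As a rough illustration of the replication idea (the field names here are our own, not SpiderMount's actual schema), duplicating a vacancy amounts to fanning one job record out across a list of locations:

```python
def replicate_job(job, locations):
    """Return one copy of `job` per location, with the location
    (and a title suffix) swapped in; the rest of the content is shared."""
    copies = []
    for loc in locations:
        copy = dict(job)                 # shallow copy of the original job
        copy["location"] = loc
        copy["title"] = f'{job["title"]} ({loc})'
        copies.append(copy)
    return copies

jobs = replicate_job(
    {"title": "Developer", "location": "London", "description": "..."},
    ["London", "Manchester", "Leeds"],
)
```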




Learn more about job scraping features.

Properties data scraping with SpiderMount software

SpiderMount's data wrapping service is a web-based data scraping tool with a variety of business applications. Among others, it can scrape property (i.e. real estate) data from different source sites and either post it automatically to a specialized board or save it in XML/CSV format.




The SpiderMount wrapping service spiders and extracts web-based real estate data, cleans up the formatting, and saves or auto-posts it in XML or CSV format.


Features for property data wrapping:

  • Scrapes property information from websites (HTML or XML) or via FTP.
  • Incremental scraping downloads only new entities.
  • Filters pages by keywords so only relevant data is scraped.
  • Auto-replaces keywords in content and cleans up HTML formatting.
  • Schedules hourly, daily or weekly spidering / posting sessions.
  • Auto-posts via XML/CSV to single or multiple destinations.
  • Custom website API configuration for auto-posting.
  • Posts to an HTTP interface, via API or SOAP, to FTP, or by email.
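A minimal sketch of the cleanup-and-export steps from the list above, assuming listings arrive as dictionaries; the field names and replacement rules are illustrative, not SpiderMount's actual configuration:

```python
import csv
import io
import re

def clean_html(text):
    """Strip tags and collapse whitespace - a crude stand-in for the
    formatting cleanup step."""
    return re.sub(r"\s+", " ", re.sub(r"<[^>]+>", " ", text)).strip()

def to_csv(listings, replacements):
    """Apply keyword auto-replacement, clean the formatting, emit CSV text."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["address", "price", "description"])
    writer.writeheader()
    for item in listings:
        desc = clean_html(item["description"])
        for old, new in replacements.items():
            desc = desc.replace(old, new)
        writer.writerow({**item, "description": desc})
    return out.getvalue()

csv_text = to_csv(
    [{"address": "1 High St", "price": "250000",
      "description": "<p>Two-bed   flat, <b>newly</b> refurbished</p>"}],
    {"flat": "apartment"},
)
```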


SpiderMount properties (real estate) scraping service ensures:

  1. Confidence in delivery
  2. Seamless support
  3. Effortless integration


Do not hesitate to request more information or schedule a demo.


Read more about job wrapping and resume scraping.


Automated posting to Amazon S3 bucket

SpiderMount adds a new job wrapping output option: the Amazon S3 bucket. In addition to uploading the scraped jobs file to a recipient API, FTP or SFTP folder, or providing a direct link to the XML or CSV file, job wrapping results can now be uploaded directly to Amazon cloud storage.


Amazon Simple Storage Service (Amazon S3) is secure, durable, highly scalable cloud storage with a web service interface to store and retrieve data.
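For readers curious what such a delivery can look like, here is a hedged sketch using the AWS boto3 SDK; the bucket name, key layout and credential handling are assumptions, not SpiderMount's actual setup:

```python
import datetime

def feed_key(client_name, ext="xml"):
    """Build a date-stamped object key, e.g. acme/2024-05-01-jobs.xml.
    The layout is our own convention for the example."""
    today = datetime.date.today().isoformat()
    return f"{client_name}/{today}-jobs.{ext}"

def upload_feed(path, bucket, key):
    """Upload a scraped-jobs file to an S3 bucket."""
    import boto3                     # credentials come from env vars or IAM
    s3 = boto3.client("s3")
    with open(path, "rb") as fh:
        s3.put_object(Bucket=bucket, Key=key, Body=fh,
                      ContentType="application/xml")

# Example (requires AWS credentials and an existing bucket):
# upload_feed("jobs.xml", "client-feeds", feed_key("acme"))
```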


Learn more about automatic posting options:

Automatic Job Posting Options Overview PDF

Automatic posting to WordPress job board via WPAllImport plugin


CSV files scraping feature added

SpiderMount can now source jobs from CSV files. Job data can be converted, cleaned and enhanced, then posted automatically to the recipient job board's API via XML or other means.

The job wrapping service automates job posting by downloading the CSV file from Dropbox or another URL, from FTP, or from the spider folder. The original CSV file can have any format of columns and content.
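Conceptually, handling an arbitrary CSV format comes down to mapping whatever columns the source uses onto standard job fields. A small sketch, with made-up column names and target tags:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical mapping from a client's CSV columns to standard fields.
FIELD_MAP = {"Job Title": "title", "City": "location", "Details": "description"}

def csv_to_jobs_xml(csv_text):
    """Map arbitrary CSV columns onto standard job fields and emit XML."""
    root = ET.Element("jobs")
    for row in csv.DictReader(io.StringIO(csv_text)):
        job = ET.SubElement(root, "job")
        for src, dst in FIELD_MAP.items():
            ET.SubElement(job, dst).text = row.get(src, "")
    return ET.tostring(root, encoding="unicode")

xml_out = csv_to_jobs_xml(
    "Job Title,City,Details\nDeveloper,London,Build things\n"
)
```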

Resume data scraping:

Resumes exported via CSV can also be automatically parsed and posted to a target recruitment database, e.g. an ATS.

Resume scraping

The SpiderMount data wrapping service can also be used for resume data scraping, conversion and posting to a recruitment database. Update resumes from multiple sources, i.e. job boards and employment websites, to keep all application data in one database and one format.


Structured wrapped resume data enables better and faster application analysis, and makes candidate search quicker and more convenient.


SpiderMount resume wrapping service ensures:

  1. Confidence in delivery: comprehensive resume extraction coverage and high accuracy.
  2. Seamless support: client’s resources are freed from daily checks by SpiderMount automated monitoring & support team.
  3. Effortless integration: resume scraping service connects to job board & ATS APIs, proprietary systems.




SpiderMount features for resume wrapping:

  • Scrapes resumes from websites (HTML or XML), from an ATS, or via FTP.
  • Incremental scraping downloads only new entities.
  • Scrapes under a logged-in account.
  • Auto-replaces keywords in content and cleans up the formatting.
  • Schedules regular spidering / posting sessions.
  • Auto-posts via XML or CSV to single or multiple websites/databases.
  • Posts to an HTTP interface, via API or SOAP, to FTP, or by email.


SpiderMount easily integrates with a wide range of job board software and other job posting platforms.


Contact us for more information or a demo.

Easy preview & verification for scraped jobs XML

SpiderMount releases a preview feature for its job scraping service. The latest update aims to simplify verification and approval of newly configured job scrapes. The preview makes it easy to verify scraped job description formatting (line spacing, bold fonts, paragraphs, bullet points, etc.) as well as field mapping.


Compared to the standard XML view, the HTML preview requires no additional parsing or specific knowledge: the user can validate the scraped data configuration in one click. An easy-to-understand preview drastically reduces the time spent approving new job scraping results.


Standard XML feed (screenshot)



HTML preview (screenshot)



The first 750 characters of the job description provide a reasonable preview to check formatting and see which title the description relates to, while keeping the listing concise for quick scrolling within the browser window. Moreover, the preview is a real-time conversion of a live XML file, so it always displays the exact content offered by the XML feed.
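One plausible way to implement such a fixed-length preview: the 750-character limit comes from the text above, while the word-boundary handling and truncation marker are our own guesses, not SpiderMount's actual behavior:

```python
def preview(description, limit=750):
    """Truncate a job description for an HTML preview, cutting at a
    word boundary and marking the truncation."""
    if len(description) <= limit:
        return description
    cut = description[:limit].rsplit(" ", 1)[0]  # drop the last partial word
    return cut + "…"

short = preview("A short description")
long_ = preview("word " * 400)   # ~2000 chars, well past the limit
```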


Create jobs XML feeds for job boards & aggregators

SpiderMount lets you publish your jobs via XML feeds (syndicated or broadcast) to aggregator websites and other job boards: Indeed, Oodle, SimplyHired, Trovit, Juju, Jooble, Adzuna, etc.


The job wrapping software scrapes vacancy data from your job website and converts the jobs into the requested format. The SpiderMount job wrapping service then adds a unique ID to each job and maps fields according to the aggregator website's requirements.
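A sketch of those two steps, assembling an aggregator-style feed: the tag names loosely follow Indeed's public feed format but are not a complete or verified implementation of it, and deriving the unique ID from the job URL is just one plausible approach:

```python
import hashlib
import xml.etree.ElementTree as ET

def job_id(job):
    """Stable unique ID derived from the job URL."""
    return hashlib.md5(job["url"].encode()).hexdigest()[:12]

def build_feed(publisher, jobs):
    """Assemble a minimal aggregator-style XML feed from job dicts."""
    root = ET.Element("source")
    ET.SubElement(root, "publisher").text = publisher
    for job in jobs:
        node = ET.SubElement(root, "job")
        ET.SubElement(node, "referencenumber").text = job_id(job)
        for field in ("title", "city", "url", "description"):
            ET.SubElement(node, field).text = job[field]
    return ET.tostring(root, encoding="unicode")

feed = build_feed("Acme Jobs", [{
    "title": "Developer", "city": "London",
    "url": "https://example.com/jobs/1", "description": "Build things",
}])
```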


Resulting Indeed feed abstract, posted by job board software (sample screenshot):



SpiderMount integrates a job taxonomy service

Most modern job boards benefit from job wrapping or spidering services. However, they still struggle to extract the correct industry category for vacancies posted by employers. SpiderMount introduces a new service to resolve this problem for automated job scraping: Job Taxonomy.


Job Taxonomy is automated industry/category recognition, with mapping based on available job data such as the job title. The online service analyses job content and determines the relevant industry category.


There are two options for integrating the service into your current solution:


Option 1: Job Taxonomy API

Job boards, ATSes and online recruitment services can send jobs XML data to the JobTaxonomy API and obtain industry category IDs mapped to the jobs.
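A hypothetical client-side call might look like the following; the endpoint URL and the response shape are placeholders we invented for illustration, so consult jobtaxonomy.com for the real interface:

```python
import json
import urllib.request

# Placeholder endpoint - NOT the real JobTaxonomy API URL.
ENDPOINT = "https://api.example.com/taxonomy"

def build_request(jobs_xml):
    """Wrap the jobs XML in an HTTP POST request for the taxonomy API."""
    return urllib.request.Request(
        ENDPOINT, data=jobs_xml.encode("utf-8"),
        headers={"Content-Type": "application/xml"}, method="POST")

def categorize(jobs_xml):
    """Send the jobs and read back the category mapping
    (assumed shape: {job_id: category_id, ...})."""
    with urllib.request.urlopen(build_request(jobs_xml)) as resp:
        return json.load(resp)

req = build_request("<jobs><job><title>Developer</title></job></jobs>")
# categorize(...) would perform the actual network call.
```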




Option 2: Bundle With Job Wrapping service

The job spidering service combined with JobTaxonomy scrapes jobs from employer websites that lack industry category listings, automatically determines the relevant industry sector/category, and updates the scraped data accordingly.




For more information: www.jobtaxonomy.com

Learn more about job scraping features.


New Job Posting Interfaces Added

SpiderMount job wrapping has added a number of job posting options: APIs of popular bulk posting tools, REST and SOAP interfaces, and a JSON option.


The job wrapping service can post to your job board / database via the following options:
1. Push a jobs XML, CSV or JSON file to your URL via various HTTP methods, i.e. POST, PUT, GET.
2. Upload the jobs XML/CSV file to your FTP folder.
3. Submit into an HTML form.
4. Provide a link to the XML or CSV file for your software to download*.
5. Your proprietary API: a custom interface can be integrated.


*The download XML option can be used for automatic posting to a WordPress job board via the WPAllImport plugin.
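For the JSON option, serializing scraped jobs could look like this; the envelope shape ({"jobs": [...]}) is illustrative rather than a documented SpiderMount format, and the resulting payload would then be pushed via one of the HTTP methods listed above:

```python
import json

def jobs_to_json(jobs):
    """Serialize a list of scraped job dicts for JSON delivery."""
    return json.dumps({"jobs": jobs}, ensure_ascii=False, indent=2)

payload = jobs_to_json([{"title": "Developer", "location": "London"}])
```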


Download full file: Automatic Job Posting Options Overview PDF.

Automatic job posting to WordPress website via WPAllImport

SpiderMount's scraped jobs output can be automatically posted to WordPress-based websites / job boards via the WPAllImport service (http://www.wpallimport.com).


WordPress-based job board owners can have their clients' jobs scraped by the SpiderMount job wrapping service and auto-posted to their website databases via WPAllImport:


SpiderMount will scrape jobs from one or more employer sites and provide a single jobs XML containing all employers' jobs. The jobs XML can then be downloaded by the WPAllImport tool and posted to the WordPress database. One can test by uploading the file manually, or set up a fully automated daily update.


WordPress website configuration required:
The WordPress website requires the WPAllImport plugin in order to import jobs from an XML file. Any WPAllImport subscription will suffice to set up automatic job imports from the SpiderMount jobs XML file.


Setting your website for daily jobs update:


Daily or custom scheduled processing is activated via cron jobs, so that the server's task scheduler regularly launches the WPAllImport script.
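A rough sketch of such a crontab, under the assumption that the import is driven by WP All Import's cron-mode URLs (a "trigger" URL to start the import, and a "processing" URL polled every couple of minutes to run it); the domain, import key and import ID are placeholders:

```
# Start the import once a day at 06:00 (placeholder URL and key).
0 6 * * *   wget -q -O /dev/null "https://example.com/wp-load.php?import_key=YOUR_KEY&import_id=1&action=trigger"
# Poll the processing URL every 2 minutes so a triggered import runs to completion.
*/2 * * * * wget -q -O /dev/null "https://example.com/wp-load.php?import_key=YOUR_KEY&import_id=1&action=processing"
```

Check WP All Import's own documentation for the exact cron URLs generated for your import.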


Contact SpiderMount for more details on job posting to WordPress-based website.

© Aspen Technology Labs