
Question re: Microsoft SharePoint Server Task Automation/Macroing

Hamurabi (Miami) Registered User regular
Hey guys.

So there's a particularly annoying/tedious task at work that requires pulling identical annual budget reports for all of the departments within the (large) organization. I'm trying to see if there's some way to save time by automating the process. Here's a screenshot of the UI the end-user is looking at, in a browser window:

https://drive.google.com/file/d/0B2XtosRxoSm7VEhJLWFkTGVmTDA/view?usp=sharing

The work basically involves changing only the "Department" and "Fund Type" fields, and saving the reports as PDFs (an option under "Actions" in the upper-left). The major slowdown is whenever you have to change "Fund Type"; it takes roughly five seconds to reload every time you do, and you end up having to do this dozens of times to get all the reports you need.

Is there some way to automate this? In short, the task is to:
  • Pull up a pre-determined group of departments (in the "Department" field);
  • Choose the first fund type (under "Fund Type");
  • Save the resulting report as a PDF;
  • Switch the fund type to a second type;
  • And then download that PDF as well.
The process repeats for roughly 30 different departments.

Any ideas? Is this even possible?

Thanks.

Posts

  • Orthanc, Death Lite, Only 1 Calorie (Off the end of the internet, just turn left.) Registered User, ClubPA regular
    edited January 2016
    It's definitely possible: if the browser can access it, then it can be scripted. The real questions are whether it's within your skill set, and whether the system is simple enough to script in a reasonable amount of time. I'm not familiar with the details of the system, but here's what I'd be looking at if I were trying to automate something like that.

    The things that will really have an effect on how hard or easy this is are:
    1. Was the system designed with this kind of automation in mind? Is there an appropriate API?
    2. How does the system handle authentication? Do you have to go through a login screen and maintain a cookie, or is there something that's easier to automate, like HTTP Basic Auth?
    3. Can the system run multiple reports in parallel for a single session, or does it assume that only one runs at once and corrupt the report if you try to run more than one?
    4. How much of the logic is done server side, and how much is rendered on the client?

    I'm going to assume you've already ruled out the presence of an API for this purpose. If you haven't, then I'd suggest googling for one, and possibly asking on Stack Overflow or similar. If there is an API, then this should be a pretty straightforward task. What I'm going to talk about here is how to automate the system when it wasn't explicitly designed to support it. There are a few different approaches, and I'm going to go from what I would consider simplest to most complex.

    Script the HTTP Requests

    The first thing I would do is install Firebug and use the network tab to see what requests are actually made to the server when generating the report, then see if you can reproduce those by hand. (NB: you don't need to use Firebug; Chrome, Firefox, Safari & IE all have built-in developer consoles with this functionality these days.)

    The best-case scenario is that you'll see a request to generate the PDF which takes the filter fields you care about (or some ID equivalent of them) as parameters in the URL after the "?".

    If this is the case, then you should be able to just copy the URL into a new browser tab to get the equivalent PDF, and change the parameters to get the other reports. That way you could prepare a list of URLs, or a page of links, so you can run through the reports without needing so much interactivity.
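    For illustration, here's a minimal sketch of that list-building step in Python. The base URL and the "department"/"fundType" parameter names are hypothetical stand-ins for whatever you actually see in the network tab:

        # Sketch: print one PDF URL per department/fund-type combination.
        # Base URL and parameter names are hypothetical -- substitute what
        # the network tab actually shows.
        from urllib.parse import urlencode

        BASE_URL = "https://sharepoint.example.com/reports/budget.pdf"
        departments = ["Finance", "Parks", "Police"]  # ...roughly 30 in practice
        fund_types = ["Operating", "Capital"]

        for dept in departments:
            for fund in fund_types:
                print(BASE_URL + "?" + urlencode({"department": dept, "fundType": fund}))

    Paste the output into a text file and you've got your page of links.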

    If you're able to make this work, you should be able to issue an equivalent request using a command line tool like curl, and it should be a trivial exercise to script out all the different reports you need this way. The challenge, as I mentioned above, is dealing with the authentication. Most likely there are some session cookies you need to include on the request. It would be possible to script the login, but what I would do is:

    When you need to generate the reports:
    1. Log in through a browser like you do today.
    2. Use developer tools like Firebug to copy the cookies into a file somewhere.
    3. Include those cookies when issuing the requests from the command line (sketched below).
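    To make step 3 concrete, here's a rough Python version using the requests library, assuming you've saved the browser's Cookie header value into a cookies.txt file. The URL and parameter names are, again, hypothetical:

        # Sketch: replay the PDF request with cookies copied from the browser.
        import requests

        # The Cookie header value saved from the logged-in browser session (step 2).
        with open("cookies.txt") as f:
            cookie_header = f.read().strip()

        resp = requests.get(
            "https://sharepoint.example.com/reports/budget.pdf",  # hypothetical
            params={"department": "Finance", "fundType": "Operating"},
            headers={"Cookie": cookie_header},
        )
        resp.raise_for_status()
        with open("Finance-Operating.pdf", "wb") as f:
            f.write(resp.content)

    Wrap that in the department/fund-type loop from before and you've got all your reports.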

    If the parameters are present on the request to get the PDF but they're in a POST body, you might have to jump straight to the command line, as you can't just type these into the browser. You could, of course, build an HTML form to send them.
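    If it does turn out to be a POST, the same sketch only needs the parameters moved into the request body (endpoint and field names still hypothetical):

        # Sketch: same as above, but the filters travel in a POST body.
        import requests

        cookie_header = open("cookies.txt").read().strip()  # copied from the browser
        resp = requests.post(
            "https://sharepoint.example.com/reports/render",  # hypothetical
            data={"department": "Finance", "fundType": "Operating"},
            headers={"Cookie": cookie_header},
        )
        resp.raise_for_status()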

    If the parameters on the PDF request are not obvious, then you need to walk back through the requests, see if you can work out how they are wired through, and replicate that. E.g. you might find they send the parameters to the server, get back a token, and then use that token to get the PDF. Or that the parameters are sent and stored in the session, meaning none are passed to the PDF request at all.

    The same kind of thing applies, though: if you can work it out, then you should be able to script it with curl or similar tools.
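    As an entirely hypothetical illustration of the token-style flow in Python:

        # Sketch: two-step flow -- submit the filters, get a token back,
        # then use the token to fetch the PDF. Every URL, field name and
        # the response shape here is hypothetical.
        import requests

        headers = {"Cookie": open("cookies.txt").read().strip()}

        # Step 1: submit the filters and receive a token.
        r = requests.post(
            "https://sharepoint.example.com/reports/prepare",
            data={"department": "Finance", "fundType": "Operating"},
            headers=headers,
        )
        token = r.json()["token"]

        # Step 2: fetch the PDF the token refers to.
        pdf = requests.get(
            "https://sharepoint.example.com/reports/fetch",
            params={"token": token},
            headers=headers,
        )
        open("Finance-Operating.pdf", "wb").write(pdf.content)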

    Script the Browser

    If it proves too complicated to work out the flow of requests, then it might be simpler to just script the browser to save yourself the clicks. There are a couple of different approaches you can use for this, but I would favour using a web testing tool like Selenium.

    Essentially this allows you to automate the browser: you'll have to identify the particular elements to click on, how to wait for and identify when things are ready, etc. You'd also need to script the login process at this point, as these tools are intended for fully automated testing, not just running a macro once you've got to the right place.
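    Here's a minimal sketch of what that looks like with Selenium's Python bindings; the element IDs, dropdown mechanics and login step are all hypothetical, so you'd inspect the real page to find the right selectors:

        # Sketch: drive the browser through every department/fund-type pair.
        from selenium import webdriver
        from selenium.webdriver.common.by import By
        from selenium.webdriver.support.ui import Select, WebDriverWait
        from selenium.webdriver.support import expected_conditions as EC

        driver = webdriver.Firefox()
        driver.get("https://sharepoint.example.com/reports")  # hypothetical URL
        # ...script the login here...

        wait = WebDriverWait(driver, 30)
        for dept in ["Finance", "Parks"]:          # the ~30 departments
            for fund in ["Operating", "Capital"]:  # the two fund types
                Select(driver.find_element(By.ID, "Department")).select_by_visible_text(dept)
                Select(driver.find_element(By.ID, "FundType")).select_by_visible_text(fund)
                # Wait out the ~5 second reload before touching the menu.
                wait.until(EC.element_to_be_clickable((By.ID, "Actions")))
                driver.find_element(By.ID, "Actions").click()
                # ...click the "save as PDF" menu entry here...
        driver.quit()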

    The alternative is to use something like Greasemonkey. Essentially this is a browser plugin that lets you inject your own JavaScript into an existing site. You could use this to build a script that runs on this page and saves you a chunk of the clicking. In this case, though, I'm pretty sure you'd still have to download the files by hand, but at least you could automate the process of walking through each set of reports you need to produce.
