Category Archives: Uncategorized

A simple batching approach to dispatch_async in iOS

dispatch_async and its cousins have been around for a while, and they make a bunch of async operations a whole lot easier.

I have been looking for a nice easy way to perform operations in batches; dispatch_group_notify is nice, but you have no control over the batch size.

Here’s a simple way to batch updates to a mutable array of NSManagedObjects (useful when you want to do some action requiring network connectivity to each one).

Anyway – here it is:

-(void)performAction:(void(^)(NSManagedObject*item))action
           inContext:(NSManagedObjectContext*)temporaryContext
    withItemsInArray:(NSMutableArray*)arrayOfItems
    usingBatchSizeOf:(NSUInteger)batchSize
                then:(void(^)(void))finished
{
    
    // if no items left, return
    if([arrayOfItems count] == 0){
        finished();
        return;
    }

    // take first items
    NSMutableArray *itemsToOperateOn;
    NSMutableArray *leftOvers;
    
    if([arrayOfItems count] < batchSize){
        itemsToOperateOn = [arrayOfItems mutableCopy];
        leftOvers = [NSMutableArray new];
    }else{
        itemsToOperateOn = [[arrayOfItems subarrayWithRange:NSMakeRange(0, batchSize)] mutableCopy];
        leftOvers = [[arrayOfItems subarrayWithRange:NSMakeRange(batchSize, [arrayOfItems count] - batchSize)] mutableCopy];
    }
    
    dispatch_group_t actionGroup = dispatch_group_create();
    // note: the action blocks run concurrently on a global queue, so whatever
    // they do with each NSManagedObject must respect Core Data's threading rules
    for(NSManagedObject *itemToOperateOn in itemsToOperateOn)
    {
        dispatch_group_async(actionGroup, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
            action(itemToOperateOn);
        });
    }

    // notify when done
    dispatch_group_notify(actionGroup, dispatch_get_main_queue(), ^{
        NSLog(@" - Batch operation performed, %lu left", (unsigned long)[leftOvers count]);
        NSError *saveError = nil;
        if (![temporaryContext save:&saveError]) {
            NSLog(@"Error saving context: %@", saveError);
        }
        [self performAction:action inContext:temporaryContext withItemsInArray:leftOvers usingBatchSizeOf:batchSize then:finished];
    });
}
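Calling it might look something like this — a minimal sketch, where the private-queue child context, the entity name, and the per-item work are all illustrative rather than from the original code:

```objc
// A hedged usage sketch — "Item", self.mainContext and the action body
// are illustrative assumptions, not part of the original method.
NSManagedObjectContext *temporaryContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
temporaryContext.parentContext = self.mainContext; // assumed main-queue context

NSFetchRequest *fetch = [NSFetchRequest fetchRequestWithEntityName:@"Item"];
NSMutableArray *items =
    [[temporaryContext executeFetchRequest:fetch error:nil] mutableCopy];

[self performAction:^(NSManagedObject *item) {
            // e.g. some synchronous network-bound work per item
        }
          inContext:temporaryContext
   withItemsInArray:items
   usingBatchSizeOf:5
               then:^{
                   NSLog(@"All batches done");
               }];
```

A batch size of around 5 keeps the number of simultaneous network requests sane while still getting some concurrency within each batch.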

Uploading file into SharePoint 2013 online, using an access token

There are two things I dislike about Microsoft.SharePoint.Client.File.SaveBinaryDirect()

WHY U HAVE NO ASYNC?

This blocks for network IO, and we should be able to use async/await syntax to deal with it. Remember when we were told we had to stop using the server OM, and then the sandbox? The biggest argument was that we would get to use all the cool new stuff from the latest version of .NET, as we wouldn’t be constrained by .NET 3.5 and old-school SharePoint internals … YEAH – so why am I still waiting for async methods on the C# client SDK? </rant>

It doesn’t seem to work nicely with a ClientContext loaded using an access token.

If you are using the magical TokenHelper.cs and grab a ClientContext using an access token, most things just work great. Apart from SaveBinaryDirect; there, you just get a bunch of errors.

Anyway.

Solution:

Something like this:


public static async Task<string> UploadUsingAccessTokenAsync(string siteCollectionUrl, string serverRelativeFolder, string filename, string accessToken, byte[] data)
{
    System.Net.ServicePointManager.Expect100Continue = false;

    var requestUrl = string.Format("{0}/_api/web/GetFolderByServerRelativeUrl('{1}')/Files/Add(url='{2}', overwrite=true)", siteCollectionUrl, serverRelativeFolder, filename);

    HttpWebRequest request = WebRequest.Create(requestUrl) as HttpWebRequest;
    request.Method = "POST";
    request.Accept = "*/*";
    request.ContentType = "application/json;odata=verbose";
    request.ContentLength = data.Length;
    request.Headers.Add("Authorization", "Bearer " + accessToken);

    // write the file bytes into the request body
    using (Stream req = await request.GetRequestStreamAsync().ConfigureAwait(false))
    {
        await req.WriteAsync(data, 0, data.Length).ConfigureAwait(false);
    }

    // read back SharePoint's JSON response describing the uploaded file
    using (HttpWebResponse response = (await request.GetResponseAsync().ConfigureAwait(false)) as HttpWebResponse)
    using (Stream res = response.GetResponseStream())
    using (StreamReader rdr = new StreamReader(res))
    {
        return await rdr.ReadToEndAsync().ConfigureAwait(false);
    }
}
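Calling it is then straightforward — a sketch, where the site URL, folder, filename and local file path are all illustrative:

```csharp
// Illustrative values — the access token would typically come from
// TokenHelper / your app's OAuth flow.
byte[] fileBytes = File.ReadAllBytes(@"C:\temp\report.pdf");

// The response body is SharePoint's JSON description of the uploaded file.
await UploadUsingAccessTokenAsync(
    "https://contoso.sharepoint.com/sites/team",
    "Shared Documents",
    "report.pdf",
    accessToken,
    fileBytes);
```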

SharePoint OAuthAuthorize.aspx issue with contributors attempting to use on-the-fly authorisation

SharePoint 2013 online has a great new way of letting external web applications request authorization to read or update data on behalf of a user. You don’t even need an Office365 app, or anything; any web application will do.

Read this pretty great overview on MSDN.

In order to set this up, you basically just:

  • Create a new web application based on one of the many SharePoint app templates (AppForSharePointWebToolkit is available via NuGet … just get something that includes the magical TokenHelper.cs class);
  • Host your web application somewhere online, and make sure it is accessible over https;
  • Register an account using the Microsoft Seller Dashboard;
  • Generate a new ClientID and ClientSecret using the Seller Dashboard (you can’t use your own values here);
  • Stick the ClientID and ClientSecret values into AppSettings in your web application’s web.config.

Right … so, if you don’t know how to do this, Google it.

Anyway …

The issue is when you go to use TokenHelper.GetAuthorizationUrl to generate the URL which requests authorization for a given user on a site. MSDN documentation (and the implementation of the method) suggests that you need to include a “Scope” parameter, and that it will only succeed if the current user has the rights you are requesting.

This is all very well if you are an owner or site collection admin – most of the scopes will work for you: Web.Write, List.Read etc.

BUT … what happens when the current user happens to be a standard contributor, and some of your lists / folders happen to have unique permissions?

Unfortunately, none of these scopes will work … not Web.Read, not List.Read … and there’s nothing lower than these. List.Read will only be granted if the current user has the ability to read all lists; something they clearly cannot do.

So – what’s the solution?

All the documentation I’ve come across seems to suggest that you must include this Scope parameter. But, it seems that you don’t actually need to.

So, you could just add something like this to TokenHelper.cs:

public static string GetAuthorizationUrlWithNoScope(string contextUrl)
{
    return string.Format(
        "{0}{1}?IsDlg=1&client_id={2}&response_type=code",
        EnsureTrailingSlash(contextUrl),
        AuthorizationPage,
        ClientId);
}
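You then send the user off to the generated URL to consent — a sketch, with an illustrative site URL; SharePoint posts an authorization code back to the redirect URI registered for your ClientId:

```csharp
// Illustrative site URL.
string authUrl = TokenHelper.GetAuthorizationUrlWithNoScope(
    "https://contoso.sharepoint.com/sites/team");

// e.g. in ASP.NET: redirect the user's browser to the consent page
Response.Redirect(authUrl, endResponse: true);

// ... SharePoint then redirects back with ?code=..., which you exchange
// for an access token via TokenHelper.GetAccessToken(...) as usual.
```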

This now works as expected, with standard contributors being able to authorize access to their SharePoint site. Queries across lists return only what they should see. They can upload files into folders they have permissions to.

Good times.

SharePoint online Javascript CSOM no longer gzips responses?

# UPDATE 1 : This was due to BREACH, an attack that exploits compressed HTTPS traffic.

# UPDATE 2 : GZIP compression over https is BACK! (as of January 2014) … the mystery continues …

One of my favourite APIs in SharePoint is the Javascript CSOM (or JSOM, as some refer to it). It’s certainly nicer to interact with than lists.asmx, or making calls directly to author.dll …

Unfortunately, there seems to be something mysterious going on with compression within SharePoint online.

The issue:

When you do an executeQueryAsync in javascript, your browser should automatically add an “Accept-Encoding: gzip, deflate, sdch” header to the request. This tells the server it may zip up the response before it’s passed back.

Back in the day, SharePoint would accept this header correctly, and return a zipped response. Pretty handy when you are returning oodles of JSON. (Would be even handier if it also allowed you to submit a zipped POST – as compression would be even more awesome over oodles of XML … but, maybe a rant for another day).

This is what it used to look like: although the response is 2.8 MB, the server is only transferring 120 KB.

Nice.

[Screenshot: what it should look like]

Unfortunately, for some reason SharePoint online is no longer compressing its responses correctly.

So, now it looks more like this: a 40 MB response to an executeQueryAsync is transferring the full 40 MB. (What on earth do you have to do to get a 40 MB response to ProcessQuery? Long story.)

[Screenshot: what it does look like]

That’s pretty … frustrating.

Anyway – hopefully we see a fix / workaround soon. Seems pretty wasteful.