
Bitmasking in Objective C


I'd recommend changing a few things:

  • The enum values can be defined with a left shift (1 << n). In my opinion, this makes them a little easier to write and read.

  • You don't need to typedef to NSUInteger; you can declare an enum type directly with typedef enum.

  • And, as others have mentioned, your property shouldn't be a pointer to a Traits type.

My code would look like this:

    #import <Foundation/Foundation.h>

    typedef enum {
        TraitsCharacterHonest       = 1 << 0,
        TraitsCharacterOptimistic   = 1 << 1,
        TraitsCharacterPolite       = 1 << 2,
        TraitsCharacterDevious      = 1 << 3,
        TraitsPhysicalTall          = 1 << 4,
        TraitsPhysicalBeautiful     = 1 << 5,
        TraitsPhysicalFat           = 1 << 6,
        TraitsPhysicalBigEyes       = 1 << 7,
        TraitsPhysicalRedHair       = 1 << 8
    } Traits;

    @interface Person : NSObject

    @property (strong, nonatomic) NSString *name;
    @property (assign, nonatomic) Traits    traits;

    @end

Setting John's traits will look like this:

    Person *john = [[Person alloc] init];
    john.traits = TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;

However, while bitmasks are useful to learn, they're a real pain to debug. If you want to print this character's traits now, you'll have to write code like this:

    NSMutableString *result = [NSMutableString string];

    if (self.traits & TraitsCharacterHonest) {
        [result appendString: @"Honest, "];
    }
    if (self.traits & TraitsCharacterOptimistic) {
        [result appendString: @"Optimistic, "];
    }
    if (self.traits & TraitsCharacterPolite) {
        [result appendString: @"Polite, "];
    }
    // etc...
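One way to cut down that boilerplate is a small lookup table of masks and names. This is a sketch of my own, not from the original code; the TraitsDescription helper and the parallel arrays are illustrative:

    NSString *TraitsDescription(Traits traits)
    {
        // Parallel tables: masks[i] corresponds to names[i].
        static const Traits masks[] = {
            TraitsCharacterHonest,
            TraitsCharacterOptimistic,
            TraitsCharacterPolite,
            // etc...
        };
        NSString * const names[] = {
            @"Honest",
            @"Optimistic",
            @"Polite",
            // etc...
        };

        NSMutableArray *found = [NSMutableArray array];
        for (size_t i = 0; i < sizeof(masks) / sizeof(masks[0]); i++) {
            if (traits & masks[i]) {
                [found addObject: names[i]];
            }
        }
        return [found componentsJoinedByString: @", "];
    }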

Additionally, the syntax for operations like removing a trait is confusing. You'll have to AND with a NOT-ed constant:

    // remove 'Tall' trait
    john.traits = john.traits & ~TraitsPhysicalTall;
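For completeness, checking and toggling follow the same pattern; the XOR toggle below is the standard idiom, though it isn't in the original code:

    // check for 'Tall' trait
    if (john.traits & TraitsPhysicalTall) {
        // John is tall
    }

    // toggle 'Tall' trait (XOR flips the bit)
    john.traits ^= TraitsPhysicalTall;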

If you can (and performance isn't too much of an issue), I'd prefer using a higher-level feature. Perhaps an NSSet with string constants? E.g.:

    __unused static NSString *TraitsCharacterHonest     = @"TraitsCharacterHonest";
    __unused static NSString *TraitsCharacterOptimistic = @"TraitsCharacterOptimistic";
    __unused static NSString *TraitsCharacterPolite     = @"TraitsCharacterPolite";
    // etc...

    @interface Person : NSObject

    @property (strong, nonatomic) NSString     *name;
    @property (strong, nonatomic) NSMutableSet *traits;

    @end

Then, once the set has been initialized (e.g. john.traits = [NSMutableSet set];), you can do:

    // adding
    [john.traits addObject: TraitsCharacterHonest];

    // checking
    [john.traits containsObject: TraitsCharacterHonest];

    // removing
    [john.traits removeObject: TraitsCharacterHonest];

This makes more sense to me. What's more, you can print a description of the traits directly with

NSLog(@"John's traits: %@", john.traits);

and you'll get reasonable output.
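For example, NSSet's default description prints each element on its own line, so the output looks roughly like this (exact formatting may vary):

    John's traits: {(
        TraitsCharacterHonest,
        TraitsCharacterOptimistic
    )}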


One issue you can run into is that a bitmask representing set membership is capped by the number of bits in the underlying data type. For instance, a 32-bit unsigned long has room for only 32 disjoint members. If you need to add a 33rd, you are out of luck unless you move to a 64-bit unsigned integer.
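To make the limit concrete, here is a brief sketch in C (the variable names are illustrative):

    #include <stdint.h>

    void capacityDemo(void)
    {
        uint32_t traits32 = 0;
        traits32 |= (1u << 31);          // the 32nd and last distinct member
        /* traits32 |= (1u << 32); */    // undefined behavior: shift >= type width

        uint64_t traits64 = 1ull << 32;  // a 64-bit mask has room for 64 members
        (void)traits32;
        (void)traits64;
    }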

One workaround is to use an array of bytes. With this approach, you specify bit membership as two pieces of data: the offset of the byte within the array and the bit mask for the specific bit within that byte.

I have also seen people use byte arrays for single membership, so that an entire byte is used rather than a single bit. This wastes memory, but it can be more flexible and convenient, and the wasted memory may not be a problem.
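A sketch of that whole-byte approach (the names here are illustrative):

    // One byte per member: index by member, non-zero means present.
    enum { TraitHonest, TraitOptimistic, TraitTall, TraitCount };

    void wholeByteDemo(void)
    {
        unsigned char traits[TraitCount] = {0};

        traits[TraitOptimistic] = 1;      // add
        if (traits[TraitOptimistic]) {    // check
            /* member present */
        }
        traits[TraitOptimistic] = 0;      // remove
    }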

For using an array of bytes to hold the set of bits, you might represent each member of the set as an unsigned long in which the least significant byte is the bit mask and the remaining bytes form an unsigned 3-byte offset into the byte array. You would then do something like the following:

    int getBitSet (unsigned char *bArray, unsigned long ulItem)
    {
        unsigned long ulByteOffset = ((ulItem >> 8) & 0x00ffffff);
        unsigned char ucByteMask = (ulItem & 0x000000ff);

        return (*(bArray + ulByteOffset) & ucByteMask);
    }

    int setBitSet (unsigned char *bArray, unsigned long ulItem, unsigned long ulNewValue)
    {
        unsigned char oldValue;
        unsigned long ulByteOffset = ((ulItem >> 8) & 0x00ffffff);
        unsigned char ucByteMask = (ulItem & 0x000000ff);

        oldValue = *(bArray + ulByteOffset) & ucByteMask;
        if (ulNewValue) {
            *(bArray + ulByteOffset) |= ucByteMask;   // set bit
        } else {
            *(bArray + ulByteOffset) &= ~ucByteMask;  // clear bit
        }

        return oldValue;
    }
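Under this scheme each member constant packs the 3-byte offset into its upper bytes and the single-bit mask into its low byte. Here is a usage sketch (the constant names and array size are my own, illustrative choices):

    // Encode each member as (byteOffset << 8) | bitMask.
    #define TRAIT_HONEST      ((0UL << 8) | 0x01)   /* byte 0, bit 0 */
    #define TRAIT_OPTIMISTIC  ((0UL << 8) | 0x02)   /* byte 0, bit 1 */
    #define TRAIT_RED_HAIR    ((1UL << 8) | 0x01)   /* byte 1, bit 0 */

    void byteArrayDemo(void)
    {
        unsigned char traits[4] = {0};            // 4 bytes = up to 32 members

        setBitSet(traits, TRAIT_OPTIMISTIC, 1);   // add the member
        if (getBitSet(traits, TRAIT_OPTIMISTIC)) {
            /* member present */
        }
        setBitSet(traits, TRAIT_OPTIMISTIC, 0);   // remove the member
    }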

You could then have a set of functions to get and set the bits, or you could use macros. With C++ you can create your own class for this functionality and provide various kinds of logical operations as well, so that you can create sets of various kinds and then perform logical operations on them.
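The macro version might look like this (a sketch; the macro names are illustrative):

    // Macro equivalents of the get/set functions above.
    #define BIT_OFFSET(item)     (((item) >> 8) & 0x00ffffff)
    #define BIT_MASK(item)       ((unsigned char)((item) & 0x000000ff))

    #define BIT_GET(arr, item)   ((arr)[BIT_OFFSET(item)] & BIT_MASK(item))
    #define BIT_SET(arr, item)   ((arr)[BIT_OFFSET(item)] |= BIT_MASK(item))
    #define BIT_CLEAR(arr, item) ((arr)[BIT_OFFSET(item)] &= ~BIT_MASK(item))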


Your major issue here is making traits a pointer. Drop the pointer, and do it like you would in C:

    john.traits |= TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;

Remember that you only need pointers in a couple of situations in Objective-C (a quick sketch follows the list):

  • When you are dealing with actual objects (derived from NSObject)
  • When you need to pass a primitive by reference (e.g. an int * argument used to return a count from a function), in which case you take the address of a local variable, and that pointer is not stored by the function.
  • When you need an array of primitive types, dynamically allocated on the heap (e.g. using malloc & friends).
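
A quick sketch of those three cases (the countSetBits helper is hypothetical):

    #include <stdlib.h>

    // 2. A helper that returns a count through an int * out-parameter.
    static void countSetBits(unsigned int value, int *outCount)
    {
        int n = 0;
        while (value) {
            n += value & 1;
            value >>= 1;
        }
        *outCount = n;
    }

    static void pointerCases(Person *john)
    {
        // 1. Actual objects (derived from NSObject) are always pointers:
        NSString *name = john.name;
        (void)name;

        // 2. Take the address of a local; the callee doesn't keep the pointer:
        int count;
        countSetBits(john.traits, &count);

        // 3. A dynamically allocated array of primitives on the heap:
        int *scores = malloc(10 * sizeof(int));
        free(scores);
    }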

Otherwise, just use a stack-allocated primitive type; you can do a lot with it.