Calculating bearing between two CLLocation points in Swift [duplicate]

ios

Here is an Objective-C solution which can easily be translated to Swift:

import CoreLocation

func degreesToRadians(degrees: Double) -> Double { return degrees * .pi / 180.0 }
func radiansToDegrees(radians: Double) -> Double { return radians * 180.0 / .pi }

func getBearingBetweenTwoPoints1(point1: CLLocation, point2: CLLocation) -> Double {
    let lat1 = degreesToRadians(degrees: point1.coordinate.latitude)
    let lon1 = degreesToRadians(degrees: point1.coordinate.longitude)
    let lat2 = degreesToRadians(degrees: point2.coordinate.latitude)
    let lon2 = degreesToRadians(degrees: point2.coordinate.longitude)

    let dLon = lon2 - lon1

    // Standard great-circle (initial) bearing formula
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let radiansBearing = atan2(y, x)

    return radiansToDegrees(radians: radiansBearing)
}

The result type is Double because that is how all location coordinates are stored (CLLocationDegrees is a type alias for Double).
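Note that atan2 returns values in -π...π, so this function returns a bearing in -180°...180°. If you want a conventional 0°–360° compass bearing, you can wrap the result yourself; a quick usage sketch (the coordinates below are just placeholder values, not from the answer):

import CoreLocation

// Placeholder coordinates, purely for illustration
let pointA = CLLocation(latitude: 52.5200, longitude: 13.4050)  // Berlin
let pointB = CLLocation(latitude: 48.8566, longitude: 2.3522)   // Paris

let bearing = getBearingBetweenTwoPoints1(point1: pointA, point2: pointB)
// Wrap -180°...180° into 0°..<360°
let compassBearing = (bearing + 360).truncatingRemainder(dividingBy: 360)
print(compassBearing)  // roughly 246°, i.e. west-south-west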


This isn't exactly accurate, but you're probably looking for something along the lines of:

import CoreLocation

func XXRadiansToDegrees(radians: Double) -> Double {
    return radians * 180.0 / .pi
}

func getBearingBetweenTwoPoints(point1: CLLocation, point2: CLLocation) -> Double {
    // Returns the angle between the two points, treating the
    // coordinates as points on a flat plane
    let x = point1.coordinate.longitude - point2.coordinate.longitude
    let y = point1.coordinate.latitude - point2.coordinate.latitude

    return fmod(XXRadiansToDegrees(radians: atan2(y, x)), 360.0) + 90.0
}

I appropriated the code from this NSHipster article, which goes into more detail about what's wrong with it. The basic issue is that it treats the coordinates as though the world were flat (which it isn't, right?). Mattt's article shows how to get true directions using MKMapPoints instead of CLLocations.
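For reference, here is a minimal sketch of what that MKMapPoint approach might look like in Swift. The function name and the angle adjustments are my own assumptions (not code from the article), and it assumes the MKMapPoint(_:) initializer from recent SDKs:

import MapKit

// Sketch: project both coordinates into MKMapPoint space (a Mercator-style
// flat map where x grows east and y grows south), then take the angle there.
func bearingViaMapPoints(from start: CLLocation, to end: CLLocation) -> Double {
    let origin = MKMapPoint(start.coordinate)
    let destination = MKMapPoint(end.coordinate)

    let x = destination.x - origin.x
    let y = destination.y - origin.y

    let degrees = atan2(y, x) * 180.0 / .pi
    // atan2 measures from the x axis (east); +90 rotates so 0° is north,
    // and the fmod(... + 360) wraps the result into 0..<360.
    return fmod(degrees + 90.0 + 360.0, 360.0)
}

Because MKMapPoint already accounts for the map projection, this gives the direction you would draw on a map, which is usually what you want for things like rotating an annotation view.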