We had an internal debate about this, and this was the consensus: all the sensors will continue to "work" in space, detecting whatever magnetic fields, accelerations, and orientation changes occur there. The issue is that apps that "use" the IMU data will not work in a meaningful way without changes.
Apps could be modified to work in some meaningful way in space, based on the kind of data they would see there. With the device oriented perpendicular to the Earth's surface, magnetic north might point straight down, or mostly down, and gravity would still point down but would be weaker, falling off with the square of the distance from Earth's center, though still detectable depending on how far out the device is. (If the environment is free fall, as in a zero-G simulator aircraft or an orbiting spacecraft, then we are guessing the accelerometer would register no measurable acceleration.)
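To give a sense of how weak "weaker" actually is, here is a minimal sketch of the inverse-square falloff of gravitational acceleration with altitude. The function name and the constants are our own choices for illustration:

```python
# Sketch (our assumption): inverse-square falloff of gravity with altitude,
# g(h) = g0 * (R / (R + h))^2, where R is Earth's mean radius.
G0 = 9.81              # m/s^2 at the surface
R_EARTH = 6_371_000.0  # m, Earth's mean radius

def gravity_at_altitude(h_m: float) -> float:
    """Gravitational acceleration magnitude at altitude h_m above the surface."""
    return G0 * (R_EARTH / (R_EARTH + h_m)) ** 2

# At a typical low-orbit altitude (~400 km), gravity is still roughly 90%
# of its surface value; an accelerometer there reads near zero only
# because the vehicle is in free fall, not because gravity is gone.
print(gravity_at_altitude(400_000))
```

This is why the free-fall case matters more than the distance: even far enough out that gravity is noticeably reduced, a device in orbit sees essentially zero proper acceleration.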
The only sensor that produces similar data in space is the gyro, which measures angular velocity used to track orientation. Gyro-derived orientation drifts over time and is normally corrected using the other sensors, so the drift may be worse than usual in space unless that correction algorithm is part of the modifications applied.
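The drift-correction idea can be sketched with a one-axis complementary filter: integrate the gyro rate for orientation, then bleed in an absolute reference angle (on Earth, typically derived from the accelerometer or magnetometer) to cancel the accumulated drift. This is a generic illustration, not the actual fusion algorithm any particular device uses; the function name, the 0.98 blend factor, and the simulated bias are our assumptions:

```python
# Minimal 1-axis complementary filter sketch (illustrative, not flight code):
# dead-reckon from the gyro, then pull the estimate toward an absolute
# reference angle to bound the drift.
def complementary_filter(angle, gyro_rate, ref_angle, dt, alpha=0.98):
    """Return a new angle estimate (degrees) from gyro integration plus correction."""
    gyro_angle = angle + gyro_rate * dt          # integrate angular velocity
    return alpha * gyro_angle + (1.0 - alpha) * ref_angle

# Simulate a gyro with a constant 0.5 deg/s bias while the true angle is 0.
# Without correction the estimate would grow to 5 degrees over 10 seconds;
# with the reference term it settles at a small bounded offset instead.
angle = 0.0
for _ in range(1000):  # 10 s at 100 Hz
    angle = complementary_filter(angle, gyro_rate=0.5, ref_angle=0.0, dt=0.01)
print(abs(angle) < 1.0)
```

In space the problem is exactly that the reference inputs change: gravity no longer gives a usable "down" in free fall, and the magnetic field differs from the surface model, so the correction term would need a different absolute reference.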
To determine orientation in space, a different approach would probably be needed: optical/IR sensors with SLAM, or perhaps RF/magnetic transmitters and receivers. We have already built such systems in vehicles to raise head-tracking accuracy to the required level.