About

Hardcoding raw hex strings in iOS projects leads to runtime crashes when the format is wrong and makes color management across large codebases difficult to maintain. This converter parses hex color codes - including 3-digit shorthand, 6-digit standard, and 8-digit with alpha - and produces the exact UIColor, SwiftUI Color, or Objective-C initializer call with normalized floating-point components in the range 0.0 - 1.0. Each channel value is computed as channelInt / 255 and rounded to three decimal places. The tool assumes the sRGB color space, which matches Apple's default UIColor initializer behavior.
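
As a rough sketch of the conversion this tool performs (the `rgbaComponents(fromHex:)` helper below is a hypothetical illustration, not the tool's actual source), a 6-digit hex string can be parsed into normalized sRGB components like this:

```swift
import Foundation

/// Parse a 6-digit hex color string (with or without "#") into
/// normalized RGBA components, each rounded to three decimal places.
func rgbaComponents(fromHex hex: String) -> (r: Double, g: Double, b: Double, a: Double)? {
    var s = hex.trimmingCharacters(in: .whitespaces)
    if s.hasPrefix("#") { s.removeFirst() }
    guard s.count == 6, let value = UInt32(s, radix: 16) else { return nil }
    func round3(_ x: Double) -> Double { (x * 1000).rounded() / 1000 }
    return (
        r: round3(Double((value >> 16) & 0xFF) / 255),  // high byte → red
        g: round3(Double((value >> 8) & 0xFF) / 255),   // middle byte → green
        b: round3(Double(value & 0xFF) / 255),          // low byte → blue
        a: 1.0                                          // no alpha pair → opaque
    )
}

// rgbaComponents(fromHex: "#3498DB") → (r: 0.204, g: 0.596, b: 0.859, a: 1.0)
```

The returned tuple feeds directly into UIColor(red:green:blue:alpha:) or SwiftUI's Color(red:green:blue:opacity:).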

Limitation: this tool does not account for wide-gamut Display P3 colors. If your design assets use a P3 profile, the output values will be sRGB approximations. For projects targeting CGColor with custom color spaces, additional conversion is required beyond what this tool provides.

Formulas

Each hex color channel is a base-16 pair representing an integer from 0 to 255. The normalized floating-point value used by UIColor is computed as:

component = parseInt(hexPair, 16) / 255

For 3-digit shorthand (#RGB), each character c is expanded to cc. For example, #F80 becomes #FF8800. Similarly, 4-digit (#RGBA) expands to 8-digit (#RRGGBBAA).
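
The expansion rule can be sketched in a few lines of Swift (the `expandShorthand` helper is hypothetical, for illustration only):

```swift
/// Expand 3- or 4-digit shorthand by duplicating each hex digit,
/// per the rule above; other lengths pass through unchanged.
func expandShorthand(_ hex: String) -> String {
    guard hex.count == 3 || hex.count == 4 else { return hex }
    return hex.map { String(repeating: String($0), count: 2) }.joined()
}

// expandShorthand("F80")  == "FF8800"
// expandShorthand("F80A") == "FF8800AA"
```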

R = parseInt(hex[0..1], 16) / 255 , G = parseInt(hex[2..3], 16) / 255 , B = parseInt(hex[4..5], 16) / 255 , A = parseInt(hex[6..7], 16) / 255

Where R = red channel, G = green channel, B = blue channel, A = alpha (opacity). If no alpha pair is present, A defaults to 1.000. All values are rounded to 3 decimal places via Math.round(val × 1000) ÷ 1000.
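
A worked instance of these formulas for #3498DB, in plain Swift arithmetic:

```swift
// Each hex pair becomes an integer 0-255, divided by 255, rounded to 3 places.
func round3(_ x: Double) -> Double { (x * 1000).rounded() / 1000 }

let r = round3(Double(0x34) / 255)  // 52  / 255 = 0.20392... → 0.204
let g = round3(Double(0x98) / 255)  // 152 / 255 = 0.59607... → 0.596
let b = round3(Double(0xDB) / 255)  // 219 / 255 = 0.85882... → 0.859
// → UIColor(red: 0.204, green: 0.596, blue: 0.859, alpha: 1.000)
```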

Reference Data

| Hex Format | Example | Expansion | Red | Green | Blue | Alpha | Swift UIColor |
| --- | --- | --- | --- | --- | --- | --- | --- |
| #RGB | #F80 | #FF8800 | 1.000 | 0.533 | 0.000 | 1.000 | UIColor(red: 1.000, green: 0.533, blue: 0.000, alpha: 1.000) |
| #RGBA | #F80A | #FF8800AA | 1.000 | 0.533 | 0.000 | 0.667 | UIColor(red: 1.000, green: 0.533, blue: 0.000, alpha: 0.667) |
| #RRGGBB | #3498DB | - | 0.204 | 0.596 | 0.859 | 1.000 | UIColor(red: 0.204, green: 0.596, blue: 0.859, alpha: 1.000) |
| #RRGGBBAA | #3498DB80 | - | 0.204 | 0.596 | 0.859 | 0.502 | UIColor(red: 0.204, green: 0.596, blue: 0.859, alpha: 0.502) |
| #RRGGBB | #E74C3C | - | 0.906 | 0.298 | 0.235 | 1.000 | UIColor(red: 0.906, green: 0.298, blue: 0.235, alpha: 1.000) |
| #RRGGBB | #2ECC71 | - | 0.180 | 0.800 | 0.443 | 1.000 | UIColor(red: 0.180, green: 0.800, blue: 0.443, alpha: 1.000) |
| #RRGGBB | #9B59B6 | - | 0.608 | 0.349 | 0.714 | 1.000 | UIColor(red: 0.608, green: 0.349, blue: 0.714, alpha: 1.000) |
| #RRGGBB | #1ABC9C | - | 0.102 | 0.737 | 0.612 | 1.000 | UIColor(red: 0.102, green: 0.737, blue: 0.612, alpha: 1.000) |
| #RRGGBB | #F39C12 | - | 0.953 | 0.612 | 0.071 | 1.000 | UIColor(red: 0.953, green: 0.612, blue: 0.071, alpha: 1.000) |
| #RRGGBB | #ECF0F1 | - | 0.925 | 0.941 | 0.945 | 1.000 | UIColor(red: 0.925, green: 0.941, blue: 0.945, alpha: 1.000) |
| #RRGGBB | #34495E | - | 0.204 | 0.286 | 0.369 | 1.000 | UIColor(red: 0.204, green: 0.286, blue: 0.369, alpha: 1.000) |
| #RRGGBB | #000000 | - | 0.000 | 0.000 | 0.000 | 1.000 | UIColor(red: 0.000, green: 0.000, blue: 0.000, alpha: 1.000) |
| #RRGGBB | #FFFFFF | - | 1.000 | 1.000 | 1.000 | 1.000 | UIColor(red: 1.000, green: 1.000, blue: 1.000, alpha: 1.000) |
| #RGB | #000 | #000000 | 0.000 | 0.000 | 0.000 | 1.000 | UIColor(red: 0.000, green: 0.000, blue: 0.000, alpha: 1.000) |
| #RGB | #FFF | #FFFFFF | 1.000 | 1.000 | 1.000 | 1.000 | UIColor(red: 1.000, green: 1.000, blue: 1.000, alpha: 1.000) |

Frequently Asked Questions

What do the pairs in a hex color code represent?
The standard 6-digit hex (#RRGGBB) encodes three color channels: red, green, and blue. Each pair is a base-16 value from 00 to FF (decimal 0 - 255). The 8-digit format (#RRGGBBAA) appends a fourth pair for the alpha channel controlling opacity, where 00 is fully transparent and FF is fully opaque. When no alpha is provided, the converter defaults A to 1.0.
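
For instance, the trailing "80" in #3498DB80 works out to roughly half opacity (a sketch, not the tool's source):

```swift
// The fourth pair of an 8-digit hex drives opacity: 00 → 0.0, FF → 1.0.
let alphaPair = "80"
let alpha = Double(UInt8(alphaPair, radix: 16)!) / 255
// 128 / 255 = 0.50196..., which the converter rounds to 0.502
```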

Why does UIColor use 0.0 - 1.0 instead of 0 - 255?
Apple's UIColor API normalizes color components to the 0.0 - 1.0 range because it abstracts away the underlying bit depth. A 0.0 - 1.0 range works identically for 8-bit, 10-bit, or 16-bit per channel representations. This also aligns with how GPU shaders process color data in OpenGL ES and Metal, making the pipeline from UIColor to GPU rendering seamless, with no integer-to-float conversion.
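
That bit-depth independence can be seen by scaling the same normalized value to different integer ranges (illustrative arithmetic only):

```swift
// One normalized component maps onto any bit depth with a single multiply:
let normalized = 0.533                               // green channel of #F80
let eightBit = Int((normalized * 255.0).rounded())   // 8-bit:  136
let tenBit   = Int((normalized * 1023.0).rounded())  // 10-bit: 545
```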

How does 3-digit shorthand expand?
In 3-digit shorthand, each character is duplicated: #ABC expands to #AABBCC. This means #F00 becomes #FF0000 (pure red), not #F00000. The same logic applies to 4-digit shorthand (#RGBA): #F00A expands to #FF0000AA. This is a CSS specification behavior (CSS Color Level 4), and this converter follows the same rule.

Does this tool support Display P3?
No. This tool outputs values for the sRGB color space, which is what the default UIColor(red:green:blue:alpha:) initializer uses. If you need Display P3 colors, you must use UIColor(displayP3Red:green:blue:alpha:) in Swift. The numeric conversion math is identical, but you must manually change the initializer name in your code. P3 hex values may represent colors outside the sRGB gamut that will be clipped on non-P3 displays.

Why round to three decimal places?
Three decimal places (e.g., 0.533) provide sufficient precision for 8-bit color. The maximum rounding error is 0.0005, which is below the 1/255 ≈ 0.00392 step between adjacent integer values. Using more decimal places adds no visual difference but increases code verbosity, so this converter uses 3 decimal places as a practical balance.

Can I enter a hex code without the # prefix?
Yes. The converter strips the # prefix automatically if present, and accepts bare hex strings like 3498DB. It also trims whitespace. However, it validates that the remaining string is exactly 3, 4, 6, or 8 valid hexadecimal characters (0 - 9, A - F). Any other length or invalid characters will trigger an error message.
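
The validation rules described above could be sketched as follows (the `sanitizeHexInput` name and helper are hypothetical, not the tool's actual code):

```swift
import Foundation

/// Normalize user input: trim whitespace, uppercase, strip a leading "#",
/// then require exactly 3, 4, 6, or 8 hex digits. Returns nil on invalid input.
func sanitizeHexInput(_ input: String) -> String? {
    var s = input.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
    if s.hasPrefix("#") { s.removeFirst() }
    let validLengths: Set<Int> = [3, 4, 6, 8]
    guard validLengths.contains(s.count),
          s.allSatisfy({ $0.isHexDigit }) else { return nil }
    return s
}

// sanitizeHexInput(" #3498db ") == "3498DB"
// sanitizeHexInput("12345")     == nil      (invalid length)
// sanitizeHexInput("GGG")       == nil      (invalid characters)
```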