Converting to Char/String from Ascii Int in Swift
Solution 1
It may not be as clean as Java, but you can do it like this:
var string = ""
string.append(Character(UnicodeScalar(50)))
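For context (my addition, not part of the original answer): in Swift 3 and later, building a scalar from a runtime integer goes through a failable initializer, because not every integer is a valid Unicode scalar value. A minimal sketch:

```swift
// Swift 3+ sketch: build the scalar from a UInt32, whose
// UnicodeScalar initializer is failable, and bind it optionally.
var string = ""
let code = 50 // ASCII code for "2"
if let scalar = UnicodeScalar(UInt32(code)) {
    string.append(Character(scalar))
}
print(string) // "2"
```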
You can also modify the syntax to look more similar if you like:
// extend Character so it can be created from an integer literal
// (in Swift 3+ the protocol is named ExpressibleByIntegerLiteral;
// it was IntegerLiteralConvertible when this answer was written)
extension Character: ExpressibleByIntegerLiteral {
    public init(integerLiteral value: Int) {
        // convert through UInt32; the failable UnicodeScalar
        // initializer traps here on invalid scalar values
        self.init(UnicodeScalar(UInt32(value))!)
    }
}
// append a character to a string with the += operator
func += (left: inout String, right: Character) {
    left.append(right)
}
var string = ""
string += (50 as Character)
Or using dasblinkenlight's method:
func += (left: inout String, right: Int) {
    // UnicodeScalar's UInt32-based initializer is failable, hence the force unwrap
    left += "\(UnicodeScalar(UInt32(right))!)"
}
var string = ""
string += 50
Solution 2
Here's a production-ready solution in Swift 3:
extension String {
    init(unicodeScalar: UnicodeScalar) {
        self.init(Character(unicodeScalar))
    }

    init?(unicodeCodepoint: Int) {
        if let unicodeScalar = UnicodeScalar(unicodeCodepoint) {
            self.init(unicodeScalar: unicodeScalar)
        } else {
            return nil
        }
    }

    static func + (lhs: String, rhs: Int) -> String {
        return lhs + String(unicodeCodepoint: rhs)!
    }

    static func += (lhs: inout String, rhs: Int) {
        lhs = lhs + rhs
    }
}
Usage:
let a = String(unicodeCodepoint: 42) // "*"
var b = a + 126 // "*~"
b += 33 // "*~!"
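One caveat worth illustrating (my own sketch, not from the answer; the `codepoint` label and the `UInt32` round-trip are my choices): the failable initializer returns nil for integers that are not valid Unicode scalar values, such as surrogate codepoints, whereas the `+` and `+=` operators above force-unwrap and would trap on them.

```swift
// Standalone sketch of a failable codepoint-to-String initializer,
// converting through UInt32 so invalid and out-of-range values return nil.
extension String {
    init?(codepoint: Int) {
        guard let value = UInt32(exactly: codepoint),
              let scalar = UnicodeScalar(value) else { return nil }
        self.init(Character(scalar))
    }
}
print(String(codepoint: 65)!)           // "A"
print(String(codepoint: 0xD800) == nil) // true: 0xD800 is a surrogate, not a valid scalar
```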
Note that this works with any valid ASCII or Unicode codepoint (the failable initializer returns nil for invalid values, such as surrogate codepoints), so you can do this:
var emoji = String(unicodeCodepoint: 0x1F469)! // "👩"
emoji += 0x200D // "👩‍" (the zero-width joiner is invisible)
emoji += 0x1F4BB // "👩‍💻"
As a personal note, I wouldn't use this in my code. I would have expected ":" + 40 to become ":40", not ":(". If you prefer the second behavior, where 40 becomes "(", then this should work well for you :)
Solution 3
If you only want String characters counting up from "A", you can use this function:
func characterFromInt(index: Int) -> String {
    let startingValue = Int(("A" as UnicodeScalar).value)
    var characterString = ""
    characterString.append(Character(UnicodeScalar(startingValue + index)))
    return characterString
}
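On current toolchains the scalar initializer involved is failable, so here is a sketch of the same helper adapted for Swift 3+ (the UInt32 conversion and force unwrap are my additions), with usage:

```swift
// Swift 3+ adaptation: convert through UInt32 and force-unwrap,
// which is safe here for small non-negative offsets from "A" (65).
func characterFromInt(index: Int) -> String {
    let startingValue = Int(("A" as UnicodeScalar).value)
    var characterString = ""
    characterString.append(Character(UnicodeScalar(UInt32(startingValue + index))!))
    return characterString
}
print(characterFromInt(index: 0))  // "A"
print(characterFromInt(index: 25)) // "Z"
```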
Kilian
Updated on April 08, 2020

Comments
-
Kilian about 4 years
I'm trying to convert the integer representation of an ascii character back into a string.
string += (char) int;
In other languages like Java (the example here) I can just cast the integer into a char. Swift obviously does not support these casts, and I'm guessing that the all-powerful NSString will somehow be able to do the trick.
-
Connor over 9 years: That's a good point. And to clean it up, you could add the += operator:
func += (inout left: String, right: Int) { left += "\(UnicodeScalar(right))" }
-
Kamal Upasena almost 9 years: Is there a way to get the decimal value from a character in Swift?
-
shadow of arman about 3 years: Looks like in Swift 5 it's renamed to Character(Unicode.Scalar(50)), tiny difference but thought I should mention it.
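For completeness, a tiny sketch of that Swift 5 spelling (my addition; Unicode.Scalar is the namespaced name, and the plain UnicodeScalar typealias still works as well):

```swift
// Swift 5: the literal 50 matches the non-failable UInt8 initializer,
// so no optional handling is needed here.
var s = ""
s.append(Character(Unicode.Scalar(50)))
print(s) // "2"
```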