Char to int in Linux C

How to convert a single char into an int [duplicate]

I have a string of digits, e.g. "123456789", and I need to extract each one of them to use in a calculation. I can of course access each char by index, but how do I convert it to an int? I've looked into atoi(), but it takes a string as its argument, so I would have to convert each char into a string and then call atoi on it. Is there a better way?

The string is not really a number, but individual digits; to be exact, a social security number. I want to run a calculation validating the SSN.

11 Answers

You can utilize the fact that the character encodings for the digits are all in order, from 48 (for '0') to 57 (for '9'). This holds for ASCII, the UTF encodings, and practically all other encodings in common use (see the comments below for more on this).

Therefore the integer value of any digit character is that character minus '0' (or 48).

char c = '1'; int i = c - '0'; // i is now equal to 1, not '1' 
char c = '1'; int i = c - 48; // i is now equal to 1, not '1' 

However, I find the first form, c - '0', far more readable.

Is there any encoding in which '9' - '0' != 9? I'm not even sure such an encoding would be allowed by ISO C++.

On encodings and the order of digits, I asked this question: stackoverflow.com/questions/782373/…. The short answer is "any encoding based on ASCII or EBCDIC, yes" (which covers 99.9% of the encodings we meet in everyday life and on the web). Also interestingly, the C/C++ standards seem to state that they only support encodings where the digits are ordered.

Is there any encoding where '0' < '1' < '2' < '3' does not hold? It would be a very strange design decision, at the least.

The C++ standard guarantees that '0' through '9' occur adjacently and in ascending order in the character set. So c - '0' works on all systems, whereas c - 48 would not work on EBCDIC, for example.

Note that C11 §5.2.1 Character sets ¶3 says: "In both the source and execution basic character sets, the value of each character after 0 in the above list of decimal digits shall be one greater than the value of the previous." The C++ standard has a similar rule.

Or you could use the "correct" method: similar to your original atoi approach, but with std::stringstream instead. That works with chars as input as well as strings. (boost::lexical_cast is another option, with a more convenient syntax.)

(atoi is an old C function, and it's generally recommended to use the more flexible and type-safe C++ equivalents where possible. std::stringstream covers conversion both to and from strings.)

You can make use of the atoi() function:

#include <stdio.h>   /* header names reconstructed; the originals were stripped in extraction */
#include <stdlib.h>  /* atoi */

int main(int argc, char* argv[])
/* the body of this example was lost in extraction */

The answers provided are great as long as you only want to handle Arabic numerals, are working in an encoding where those numerals are sequential, and they sit in the same positions as in ASCII.

This is almost always the case.

If it isn’t then you need a proper library to help you.

  1. First convert the byte string to a Unicode string (left as an exercise for the reader).
  2. Then use uchar.h to look at each character.
  3. If UBool u_isdigit(UChar32 c) reports that the character is a digit,
  4. then its value is int32_t u_charDigitValue(UChar32 c).

Or maybe ICU has some function to do it for you — I haven’t looked at it in detail.


Convert Linux C Char Array to Int

Need some advice on this one, as I'm struggling a bit and cannot figure it out. I have a file that gets updated on a PC to indicate a system ran and what time it ran. I am writing a very simple Linux console app (it will eventually be a Nagios plugin) that reads this file and responds depending on what it found in the file. I am a total newbie to programming on Linux and using C, so please be patient, and if you could explain any answers it would really be appreciated.

Basically I want to convert a char array containing 5 characters into an integer; however, the 5th char in the array is always a letter, so technically all I want to do is convert the first 4 chars in the array to an integer. How? I've tried multiple ways with no success. My problem is that presently I do not have a good grasp of the language, so I have no real idea of what it can and cannot do.

Here is the source to my program. The buf array will hold a string taken from the file that looks something like 3455Y (the number will be random but always 4 chars long). Sorry for the poor formatting of the code, but I cannot get this stupid window, for love nor money, to format it correctly.

#include <stdio.h>    /* header names were stripped in extraction; these six are a best guess */
#include <stdlib.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <time.h>

#define COPYMODE 0644

int main(int argc, char *argv[])
{
    int i, nRead, fd;
    int source;
    int STATE_OK = 0;
    int STATE_WARNING = 1;
    int STATE_CRITICAL = 2;
    int STATE_UNKNOWN = 3;
    int system_paused = 0;
    char buf[5];
    int testnumber;

    if ((fd = open(argv[1], O_RDONLY)) == -1) {
        printf("failed open : %s", argv[1]);
        return STATE_UNKNOWN;
    } else {
        nRead = read(fd, buf, 5);
    }
    close(source);

    if (buf[4] == 'P') {
        printf("Software Paused");
        return STATE_WARNING;
    } else {
        return STATE_OK;
    }

    time_t ltime;               /* calendar time */
    struct tm *Tm;
    ltime = time(NULL);         /* get current calendar time */
    Tm = localtime(&ltime);

    int test;
    test = Tm->tm_hour + Tm->tm_min;
    printf("%d", test);
    printf("%d", strtoi(buf));  /* strtoi does not exist -- this line is the poster's problem */
}

To format the code, use the «10101» button at the top of the window that you’re typing your question in


How to convert a char[] string to int in the Linux kernel?

The kernel has neither atoi nor strtol as such: the "C/C++ standard library" is only available to userspace applications. For many such functions there are functional equivalents in kernel land, though not necessarily with the same name.

6 Answers

See the various incarnations of kstrtol() in #include <linux/kernel.h> in your friendly Linux source tree.

Which one you need depends on whether *buffer is a user or a kernel address, and on how strict your needs are regarding error handling / checking of the buffer contents (things like: is 123qx invalid, or should it return 123?).

See also lkml.org/lkml/2011/4/12/361 regarding the kstrto*() funcs vs. simple_strtol() and/or strict_strtol(). In any case you're right: if you're not on the bleeding edge, use those. See that link also regarding "user address".

Minimal runnable kstrtoull_from_user debugfs example

The kstrto*_from_user family is very convenient when dealing with user data.

#include <linux/debugfs.h>   /* header names reconstructed; originals were stripped */
#include <linux/module.h>
#include <linux/kernel.h>
#include <linux/stat.h>      /* S_IRUSR */

static struct dentry *toplevel_file;

static ssize_t write(struct file *filp, const char __user *buf,
                     size_t len, loff_t *off)
{
    int ret;
    unsigned long long res;

    ret = kstrtoull_from_user(buf, len, 10, &res);
    if (ret) {
        /* Negative error code. */
        pr_info("ko = %d\n", ret);
        return ret;
    } else {
        pr_info("ok = %llu\n", res);
        *off = len;
        return len;
    }
}

static const struct file_operations fops = {
    .owner = THIS_MODULE,
    .write = write,
};

static int myinit(void)
{
    toplevel_file = debugfs_create_file("lkmc_kstrto", S_IWUSR, NULL, NULL, &fops);
    if (!toplevel_file) {
        return -1;
    }
    return 0;
}

static void myexit(void)
{
    debugfs_remove(toplevel_file);
}

module_init(myinit)
module_exit(myexit)
MODULE_LICENSE("GPL");
insmod kstrto.ko
cd /sys/kernel/debug
echo 1234 > lkmc_kstrto
echo foobar > lkmc_kstrto

Tested in Linux kernel 4.16 with this QEMU + Buildroot setup.


For this particular example, you might have wanted to use debugfs_create_u32 instead.

Because a lot of common functions/macros are unavailable in the Linux kernel, you cannot use any direct function to get an integer value from a string buffer.

This is code that I have been using for a long time for this purpose, and it can be used on all *NIX flavors (probably without any modification).

It is a modified form of code that I took a long time back from an open source project (I don't remember the name now).

/*
 * The macro definitions and the function header were mangled in extraction;
 * they are reconstructed here from the BSD strtol() this code is based on.
 */
#define ISSPACE(c)  ((c) == ' ' || ((c) >= '\t' && (c) <= '\r'))
#define ISASCII(c)  (((c) & ~0x7f) == 0)
#define ISUPPER(c)  ((c) >= 'A' && (c) <= 'Z')
#define ISLOWER(c)  ((c) >= 'a' && (c) <= 'z')
#define ISALPHA(c)  (ISUPPER(c) || ISLOWER(c))
#define ISDIGIT(c)  ((c) >= '0' && (c) <= '9')

long strtol(const char *nstr, char **endptr, int base)
{
    const char *s = nstr;
    unsigned long acc;
    int c;
    unsigned long cutoff;
    int neg = 0, any, cutlim;

    do {
        c = *s++;
    } while (ISSPACE(c));
    if (c == '-') {
        neg = 1;
        c = *s++;
    } else if (c == '+')
        c = *s++;
    if ((base == 0 || base == 16) && c == '0' && (*s == 'x' || *s == 'X')) {
        c = s[1];
        s += 2;
        base = 16;
    }
    if (base == 0)
        base = c == '0' ? 8 : 10;
    cutoff = (unsigned long)ULONG_MAX / (unsigned long)base;
    cutlim = (unsigned long)ULONG_MAX % (unsigned long)base;
    for (acc = 0, any = 0; ; c = *s++) {
        if (!ISASCII(c))
            break;
        if (ISDIGIT(c))
            c -= '0';
        else if (ISALPHA(c))
            c -= ISUPPER(c) ? 'A' - 10 : 'a' - 10;
        else
            break;
        if (c >= base)
            break;
        if (any < 0 || acc > cutoff || (acc == cutoff && c > cutlim))
            any = -1;
        else {
            any = 1;
            acc *= base;
            acc += c;
        }
    }
    if (any < 0) {
        acc = INT_MAX;
    } else if (neg)
        acc = -acc;
    if (endptr != 0)
        *((const char **)endptr) = any ? s - 1 : nstr;
    return (acc);
}


Convert char to int in C and C++

@Alf P. Steinbach: The original question was vague regarding which language. With the keywords c and c++, I think answers addressing both languages are reasonable.

From my extensive experience on other technical forums, my intuition is that the OP really means "how do I take the textual representation of a number (in base 10) and convert it to the corresponding number?" Generally speaking, C and C++ neophytes usually have incredibly fuzzy ideas about how text works in those languages and what char really means.

@KarlKnechtel: If that's true (I give it about 50/50, as lots of early tutorials also encourage getting ASCII values out of chars, even though ASCII doesn't cover the full range), the OP needs to clarify – but then it's a dupe of stackoverflow.com/questions/439573/….

The OP had three hours to clarify this question and failed to do so. As it is, there’s no way to know what is actually asked. Voted to close.

14 Answers

Depends on what you want to do:

to read the value as an ASCII code, you can write

char a = 'a';
int ia = (int)a; /* note that the int cast is not necessary -- int ia = a would suffice */ 

to convert the character '0' -> 0, '1' -> 1, etc., you can write

char a = '4';
int ia = a - '0'; /* check here if ia is bounded by 0 and 9 */ 

Explanation:
a - '0' is equivalent to ((int)a) - ((int)'0'), which means the ASCII values of the characters are subtracted from each other. Since 0 comes directly before 1 in the ASCII table (and so on until 9), the difference between the two gives the number that the character a represents.

@KshitijBanerjee That's not a good idea, for two reasons: it gives you a negative number for ASCII characters before '0' (like & -> -10), and it gives you numbers larger than 10 (like x -> 26).

@kevin001 If you want to convert the char to an int and the character '1' gives you an ASCII value that is not 1, you need to subtract the offset '0' to realign it to count from 0-9. The digit characters '0'-'9' are consecutive in the ASCII table.


@foo-bah But I didn't understand why we have to subtract the character '0'. If we only typecast that character to an integer and store it in an int, why does it give the wrong result?

Well, in ASCII code, the numbers (digits) start from 48. All you need to do is:

int x = character - 48; 

Or, since the character '0' has the ASCII code 48, you can just write:

int x = character - '0'; // The (int) cast is not necessary. 

C and C++ always promote types to at least int. Furthermore, character literals are of type int in C and of type char in C++.

You can convert a char type simply by assigning to an int .

char c = 'a'; // narrowing on C
int a = c; 

-1 The answer is incorrect for the only meaningful interpretation of the question. This (code int a = c; ) will keep any negative values, which C standard library functions can’t deal with. The C standard library functions set the standard for what it means to handle char values as int .

@Matt: I’m keeping the downvote. I’d strengthen it if possible! The question interpretation you and others have assumed is not meaningful, because it’s too utterly trivial, and because for the OP’s particular combination of types there is a not-so-trivial very important practical issue. The advice you give is directly dangerous to the novice. It will most likely result in Undefined Behavior for their programs that use C standard library character classification functions. Re ref. to @Sayam’s answer, he has deleted that answer.

What do you mean by "always promote"? Values are promoted during implicit conversions, certain types of parameter passing (e.g., to a varargs function), and when an operator must make its operands compatible types. But there are certainly times when a value is not promoted (like if I pass a char to a function expecting a char), otherwise we wouldn't have any types smaller than an int.

char is just a 1-byte integer. There is nothing magic about the char type! Just as you can assign a short to an int, or an int to a long, you can assign a char to an int.

Yes, the name of the primitive data type happens to be «char», which insinuates that it should only contain characters. But in reality, «char» is just a poor name choice to confuse everyone who tries to learn the language. A better name for it is int8_t, and you can use that name instead, if your compiler follows the latest C standard.

Though of course you should use the char type when doing string handling, because the indexes of the classic ASCII table fit in 1 byte. You could, however, do string handling with regular ints as well, although there is no practical reason in the real world why you would ever want to do that. For example, the following code will work perfectly:

You have to realize that characters and strings are just numbers, like everything else in the computer. When you write 'a' in the source code, it is preprocessed into the number 97, which is an integer constant.

So if you write an expression like

char ch = '5';
ch = ch - '0'; 

this is actually equivalent to

char ch = (int)53;
ch = ch - (int)48; 

which then goes through the C language integer promotions

ch = (int)ch - (int)48; 

and is then truncated to a char to fit the result type

ch = (char)((int)ch - (int)48); 

There are a lot of subtle things like this going on between the lines, where char is implicitly treated as an int.

