Some of my favorite programming languages allow you to be "fast and loose" with data types. JavaScript and PHP, for instance, will convert variables between number and string types as needed. Some consider this behavior to be "sub-optimal," while others don't. But understanding how your programming language converts literals or variables between types is important, no matter what language you're using.
Consider the following C program:
#include <stdio.h>

int main( int argc, char *argv[] ) {
    char *start = "1234";
    int a = 2;
    printf( "%s\n", (start + a) );
    return 0;
}

When you compile and run this program, it prints out the string "34" and then exits. Now look at this JavaScript function:
function foo() {
    var start = "1234";
    var a = 2;
    console.log( start + a );
}
It prints out the string "12342". Understanding why C does one thing and JavaScript does another is important. C aficionados can probably quickly point out that the printf() function takes a pointer to an array of characters as its input. Adding the integer 2 to the pointer causes it to point two bytes ahead. When interpreted as a string, (start + a) is simply a two-byte string with the value "34".
JavaScript, on the other hand, converts the number 2 into the string '2' and appends it to the string.
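For comparison, here is a sketch of how you might reproduce the C program's result in JavaScript. There is no pointer arithmetic on strings, but String.prototype.slice() skips a given number of characters, which amounts to the same thing:

```javascript
// JavaScript has no pointer arithmetic, but slice() can skip
// the first `a` characters, mimicking what (start + a) does in C.
var start = "1234";
var a = 2;

console.log( start + a );        // concatenation: 2 is coerced to "2"
console.log( start.slice( a ) ); // the C-style result: "34"
```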
Doing the same thing in PHP yields yet another result. Executing the following PHP fragment will cause the system to print the string "1236":
$start = "1234";
$a = 2;
echo ( $start + $a );
PHP peeks inside the variable $start, sees that it looks like a number, converts it to an integer, and performs a numeric addition.
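To get the PHP-style numeric result in JavaScript, you have to request the conversion explicitly, since the + operator prefers string concatenation when either operand is a string. A small sketch using Number():

```javascript
// PHP converts the numeric-looking string for you; JavaScript
// only does so if you ask explicitly with Number().
var start = "1234";
var a = 2;

console.log( Number( start ) + a ); // the PHP-style result: 1236
```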
JavaScript provides functions to convert numbers to strings and vice versa. The "String( val )" function converts the argument 'val' to a string, while the "Number( val )" function attempts to convert 'val' to a number. People went to the trouble of specifying these functions and documenting them, so you might as well use them.
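A quick illustration of both functions, including what happens when a conversion cannot succeed:

```javascript
var a = 12;
var b = '34';

console.log( String( a ) );     // number to string: "12"
console.log( Number( b ) );     // string to number: 34
console.log( Number( "abc" ) ); // conversion can fail: NaN
```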
Some people, their minds perhaps addled by exposure to early versions of PHP, have been seen to do things like this in JavaScript:
var a = 12;
console.log( "" + a );
or
var b = '34';
console.log( 1 * b );
Adding an empty string to a number in JavaScript will (should) cause the interpreter to convert the value of a into a string. Multiplying the string b by one does the opposite, converting the string into a number.
Some people believe this type of conversion is faster; others think it's just plain ugly. It is certainly the case that "standard" functions exist to do the same thing, and they convey the programmer's intent more clearly.
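Whatever you think of the style, the implicit tricks and the standard functions produce the same values, as this quick sketch shows:

```javascript
var a = 12;
var b = '34';

// Implicit coercion vs. the explicit "standard" functions:
console.log( "" + a === String( a ) ); // true -- both give "12"
console.log( 1 * b === Number( b ) );  // true -- both give 34
```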
It's up to you, of course, which technique you use to coerce a value to a particular type, but if you inherit code full of superfluous additions and multiplications, this might be what's going on.