Coding Languages: Typed vs. Untyped

The bootcamp I just finished taught us to code using Ruby and JavaScript, but my first real experience with coding was in Java. I noticed one big difference between Java and the languages I learned in the bootcamp: in Java, every variable gets a data type when it is declared. This turns out to be a difference that divides many languages: some require you to declare types, while others don't. In this blog post, I'm going to share my thoughts on that difference.

First of all, I'll explain what I mean by types in coding. Every valid value in your code has a type, some of the most common ones being String, Integer (or Number), and Object. Each coding language treats types differently; for example, some let you combine two things of different types, but only certain combinations are allowed. What I'm going to focus on is how some languages make you explicitly declare the type of a variable, while others give variables implicit types (you may also see this difference described as static versus dynamic typing).
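
To make the idea of types concrete, here's a quick sketch in JavaScript (one of the languages I'll cover below) using the built-in typeof operator to check a few common types:

```javascript
// A few common types, checked with JavaScript's typeof operator
console.log(typeof "hello")        // "string"
console.log(typeof 42)             // "number"
console.log(typeof { name: "a" })  // "object"
```

Every value has a type whether or not you ever wrote that type down, and that's the key idea behind implicit typing.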

I'll talk about the differences in declaring and reassigning variables in the three languages I am most familiar with: Ruby, JavaScript, and Java. In Ruby, you don't have to do much to declare a variable. Any valid identifier you assign a value to becomes a variable in Ruby:

ruby_variable = "I'm a Ruby variable!"

And if you want to reassign that variable to something with a different type, it’s very easy to do:

puts ruby_variable.class   # prints "String"
ruby_variable = 6
puts ruby_variable.class   # prints "Integer"

From this you can see that a variable's type is implicitly decided by whatever it is assigned to. My understanding of why Ruby works this way is that Ruby was designed to be friendly to the coders using it. To make things easier, it lets you declare variables without much ceremony, and if you want to reassign a variable, you don't need to worry about what type it was before. Ruby gives you a lot of freedom to use variables however you want.

Now, in JavaScript, when first declaring a variable you can't just assign a value to a bare name; you have to put one of three keywords before the variable name: var, let, or const. (Strictly speaking, assigning to an undeclared name only throws an error in strict mode; outside strict mode, JavaScript quietly creates a global variable, which is almost always a mistake.)

// doesn't work without one of the keywords when first
// declaring the variable
jsVariable = "I don't work" // invalid (ReferenceError in strict mode)
// uses 'let', so it works
let jsVariable = "I work!" // valid

But just as in Ruby, if you want to reassign a JavaScript variable to something with a different type, it is very easy:

console.log(typeof jsVariable)   // prints "string"
jsVariable = 28
console.log(typeof jsVariable)   // prints "number"

JavaScript has the same implicit typing of variables as Ruby. This is probably because JavaScript is very free in how it deals with different types. It tries to be helpful by coercing values so you can combine almost any two types, even when the combination doesn't really make sense.
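
Here's a sketch of what that coercion looks like in practice (these are standard JavaScript behaviors, nothing I've defined myself):

```javascript
// JavaScript coerces operands so the operation "works", sometimes surprisingly
console.log("1" + 2)    // "12" - the number is coerced to a string, then concatenated
console.log("3" * "4")  // 12   - both strings are coerced to numbers
console.log(1 + true)   // 2    - true is coerced to the number 1
```

None of these lines would even compile in Java; in JavaScript they all run happily, which is convenient right up until it hides a bug.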

But Java is very different. Like in JavaScript, when first declaring a variable in Java, you can't just write a name and assign it a value:

javaVariable = "I won't work"  // causes an error

You need to add the type before the variable name. That's because, in Java, you have to declare a variable with a type when you first define it, and you can only assign it something of that type:

// because javaVariable is declared as an 'int',
// you can't assign it a string
int javaVariable = "I'm not an integer"; // invalid
// declared with the type 'String', so it works
String javaVariable = "I'm a Java variable!"; // valid

Since each variable in Java has a type associated with it, you can’t ever reassign it to something with a different type:

javaVariable = 52;    // causes an error

Once you’ve declared a variable in Java, anything you reassign it to must have the same type as what you declared it with:

javaVariable = "52";  // this is valid

Java is very rigid when it comes to types. This is probably because Java has to be compiled before it is run, and it will not compile if it finds any type errors. Giving every variable a declared type helps ensure there are fewer errors in your code.

And that brings me to my conclusion on the benefits of declaring variable types in code. In my opinion, assigning types can, as I said, make errors less likely or easier to catch. When you have to declare a type, it makes you think more about how your variable is going to be used, which can decrease your chances of using it incorrectly. Declaring variable types can also make it easier to decipher your code, either when you look back at it after a long time, or when someone else looks at it, because variables can have a clearer purpose when they have a clear type.

But languages like Ruby or JavaScript, which don't require declared types, have benefits too. For one thing, they are much easier for new programmers to pick up because there aren't as many rules to remember. They also make it easier to combine things of different types. For example, you can make arrays that store things of several different types, while in Java, every array is declared with a type and everything stored in it must be of that type.
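
For instance, here is a small sketch of a mixed-type array in JavaScript; nothing like this would work as a plain typed array in Java:

```javascript
// An array holding four different types at once - JavaScript allows this freely,
// while a Java array would need a single declared element type
const mixed = [1, "two", true, { three: 3 }];
mixed.forEach(item => console.log(typeof item));
// logs: number, string, boolean, object
```

(Java can approximate this with an Object[] array, but then you give up knowing what each element actually is until you cast it back.)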

In conclusion, both typed and untyped coding languages have their own advantages and drawbacks. Different people prefer one or the other depending on how they like to work and what they value in their code. Ultimately, I think it is important to understand both approaches, because most languages use one or the other, and knowing the difference will make it easier to learn any new language.
