Ruby: what is the pitfall in this simple code excerpt that tests variable existence
- by zipizap
I'm starting out with Ruby, and while writing some test samples I stumbled on an error in the code that I don't understand.
The code is meant to test whether a variable finn is defined?(): if it is, it increments it; if it isn't, it defines it with the value 0 (zero).
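For reference, the behavior I am after can be sketched with an instance variable instead (@finn is just a stand-in name here); instance variables simply read as nil before their first assignment, so there is no NameError to guard against:

```ruby
# Intended increment-or-initialize logic, sketched with an instance
# variable: reading an unassigned instance variable yields nil.
@finn = @finn ? @finn + 1 : 0   # first run: @finn becomes 0
@finn = @finn ? @finn + 1 : 0   # second run: @finn becomes 1
```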
As the code threw an error, I started to decompose it into small pieces and run them, to better trace where the error was coming from.
The code was run in IRB (irb 0.9.5 (05/04/13)), using ruby 1.9.1p378.
First I verify that the variable finn is not yet defined, and all is ok:
?> finn
NameError: undefined local variable or method `finn' for main:Object
from (irb):134
from /home/paulo/.rvm/rubies/ruby-1.9.1-p378/bin/irb:15:in `<main>'
>>
Then I verify that the following inline condition executes as expected, and all is ok:
?> ((defined?(finn)) ? (finn+1):(0))
=> 0
And now comes the code that throws the error:
?> finn=((defined?(finn)) ? (finn+1):(0))
NoMethodError: undefined method `+' for nil:NilClass
from (irb):143
from /home/paulo/.rvm/rubies/ruby-1.9.1-p378/bin/irb:15:in `<main>'
I was expecting that the code would not throw any error, and that after execution the variable finn would be defined with an initial value of 0 (zero). Instead, the code throws the error, and finn gets defined but with a value of nil.
>> finn
=> nil
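One further experiment along the same lines, which may be related: even an assignment that never executes seems to make the local variable "defined" afterwards, which suggests the variable is registered as soon as the parser sees the assignment, not when it runs:

```ruby
# The right-hand side below never runs, yet x ends up defined (as nil).
x = 1 if false
p defined?(x)   # prints "local-variable"
p x             # prints nil
```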
Where might the error come from? Why does the inline condition work on its own, but not when used in the finn assignment?
Any help appreciated :)