Tokenizing a string with variable whitespace
Posted by Ron Holcomb on Stack Overflow
Published on 2012-12-06T05:01:05Z
c++
I've read through a few threads detailing how to tokenize strings, but I'm apparently too thick to adapt their suggestions and solutions into my program. What I'm attempting to do is tokenize each line from a large (5k+ line) file into two strings. Here's a sample of the lines:
0 -0.11639404
9.0702948e-05 0.00012207031
0.0001814059 0.051849365
0.00027210884 0.062103271
0.00036281179 0.034423828
0.00045351474 0.035125732
The difference I'm finding between my lines and the other sample input from other threads is that I have a variable amount of whitespace between the parts that I want to tokenize. Anyways, here's my attempt at tokenizing:
#include <iostream>
#include <iomanip>
#include <fstream>
#include <string>
using namespace std;

int main(int argc, char *argv[])
{
    ifstream input;
    ofstream output;
    string temp2;
    string temp3;

    input.open(argv[1]);
    output.open(argv[2]);
    if (input.is_open())
    {
        while (!input.eof())
        {
            getline(input, temp2, ' ');
            while (!isspace(temp2[0])) getline(input, temp2, ' ');
            getline(input, temp3, '\n');
        }
        input.close();
        cout << temp2 << endl;
        cout << temp3 << endl;
        return 0;
    }
I've clipped it some, since the troublesome bits are here. The issue I'm having is that temp2 never seems to catch a value. Ideally, it should get populated with the first column of numbers, but it doesn't. Instead, it is blank, and temp3 is populated with the entire line. Unfortunately, in my course we haven't learned about vectors, so I'm not quite sure how to adapt the other solutions I've seen that use them, and I'd like to not just copy-paste code for assignments to get things working without actually understanding it. So, what's the extremely obvious/already been answered/simple solution I'm missing? I'd like to stick to the standard libraries that g++ ships with if at all possible.
© Stack Overflow or respective owner