[plt-scheme] How to tokenize a string?
Hi,
You might not need a tokenizer.
Something like this should do it:
(with-input-from-file "my-file.txt" #:mode 'text
  (lambda ()
    (let loop ([numbers '()]        ; accumulated lists of numbers, newest first
               [line (read-line)])  ; current line, or an eof object
      (if (eof-object? line)
          (reverse numbers)
          (loop (cons (filter (lambda (x) x)         ; drop the #f entries
                              (map string->number    ; "2.55e-2" -> 0.0255, "" -> #f
                                   (regexp-split #px"\\s+" line)))
                      numbers)
                (read-line))))))
Lines are read one by one. Each line is split on whitespace with
regexp-split, the resulting strings are turned into numbers with
string->number (which returns #f for the empty strings the split can
produce), those #f values are removed with filter, and each line's numbers
are consed onto an accumulator, giving a list of lists of numbers.
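If you want to check the result without a file, the same loop can be run on
the sample lines from your message using with-input-from-string (from
scheme/port), which reads from a string instead of a file. A quick sketch:

(require scheme/port)  ; for with-input-from-string

(with-input-from-string
 " 2.5500000e-002 3.6238983e-002\r\n 2.5500000e-002 3.6238638e-002\r\n"
 (lambda ()
   (let loop ([numbers '()]
              [line (read-line)])
     (if (eof-object? line)
         (reverse numbers)
         (loop (cons (filter (lambda (x) x)
                             (map string->number
                                  (regexp-split #px"\\s+" line)))
                     numbers)
               (read-line))))))
;; should give something like '((0.0255 0.036238983) (0.0255 0.036238638))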
Hope this helps,
Laurent
On Sun, Nov 8, 2009 at 08:41, <krhari at ccs.neu.edu> wrote:
> Hi,
> I have a file that contains data like this:
>
> 2.5500000e-002 3.6238983e-002
> 2.5500000e-002 3.6238638e-002
> 2.5500000e-002 3.6237603e-002
> ...
>
> I am working on a small animation and I need these numbers to calculate
> some things there... I am using the 2htdp/universe teachpack...
>
> I actually found the string-tokenize function in the SRFI docs... Can I use
> it? If yes, can anyone guide me on how to use SRFI? If not, is there any
> other way out other than converting the file into a string, then using
> string->list and doing it in a brute-force way?
>
> If I use file->string, I get a string like this
> " 2.5500000e-002 3.6238983e-002\r\n 2.5500000e-002 3.6238638e-002\r\n
> 2.5500000e-002 3.6237603e-002\r\n"
>
> Please help me out...
>
> Thanks
> Hari