Google's indexing of the OEIS

N. J. A. Sloane njas at research.att.com
Wed Jan 5 22:42:41 CET 2005


Thanks to everyone who contributed to this discussion.

The most significant reply came from Tony Noe, who said:

(quote)

When Google finds many matches at a single URL,
it normally returns only a subset of all the matches.  
Then, on the last page, it shows the message

"In order to show you the most relevant results,
we have omitted some
entries very similar to the 50 already displayed.  
If you like, you can
repeat the search with the omitted results included."

I tried this with "t. d. noe" (in quotes).  
Initially, Google found fewer than 50 matches.  
When I asked for the omitted results, it found 228 matches
-- with a match in almost every part of OEIS.  
So I think Google is working fine.

(end quote)

So I also think Google is working fine!

---------------
About the problem with "Arg list too long",
thanks to everyone who suggested ways around it.

I had said:

> If you try to execute  something like
> 
>      ls oeis* | ...
> 
> and there are too many files oeis*, you get an error message
> 
>      Arg list too long
> 
> I didn't actually see this message, because the ls command
> is buried deep in the shell scripts, but I assume that is
> what was causing the "lookup" command to fail.

Some of the suggestions were:

   ls | egrep '^oeis' | ...

   ls | grep oeis | ...

   find . -name "a*" -prune | ...

and "xargs" is the command that I would use myself.

But the trouble is deeper than that.  You can get the
"Arg list too long" error message from other shell commands
too (rm, awk, etc.), and my lookup programs have several
thousand lines of shell.  I don't have time right now to
go through them to make them more robust to this kind of error.
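
As a rough illustration (again just a sketch, not the real
scripts), a call such as

   rm oeis*

would have to become something like

   ls | grep '^oeis' | xargs rm

or, with GNU find,

   find . -maxdepth 1 -name 'oeis*' -exec rm {} +

at every affected place in the scripts.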

So I have restored all the programs to the way they
were yesterday, and it appears that everything is working
again.  In any case, Tony Noe's comment suggests
that Google is doing a good job!

NJAS





