Value too large for defined data type

Ask questions regarding Gentoox, Gentoo and Linux in general in these forums and we'll do our best to help you!
lidback
Linux User
Posts: 19
Joined: Fri Aug 12, 2005 6:26 pm


Post by lidback »

I started my Apache server and wanted to serve a DVD movie in .img format (4.3 GB).
When I tried to access it from a web browser I got error 403, so I looked in my Apache logs and saw the message: "Value too large for defined data type"...
How can I fix it?

Kind regards, Andreas Lidback
rocketeer
Pro
Posts: 75
Joined: Sun Aug 17, 2003 12:05 am

Post by rocketeer »

If the filename isn't too long then the file is probably too large.
lidback
Linux User
Posts: 19
Joined: Fri Aug 12, 2005 6:26 pm

Post by lidback »

rocketeer wrote:If the filename isn't too long then the file is probably too large.
Yes, I know it is the file size that is too large, but I wonder if it can be fixed so the system can read DVD images of about 4 gigabytes.
rocketeer
Pro
Posts: 75
Joined: Sun Aug 17, 2003 12:05 am

Post by rocketeer »

I think you'll encounter all kinds of problems with applications handling files > 4 GB on a 32-bit machine.
Can you not split the file (for example with RAR)?
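As a sketch of the splitting idea, coreutils "split" and "cat" can do the same job as RAR without compression overhead (the file names here are just examples, and a small dummy file stands in for the real image):

```shell
#!/bin/sh
# Sketch, not from the thread: split a large file with coreutils "split"
# instead of RAR, then rejoin the pieces with "cat".
# movie.img is an example name; we fake it with a small dummy file here.
dd if=/dev/zero of=movie.img bs=1M count=5 2>/dev/null

split -b 2M movie.img movie.img.part.    # -> movie.img.part.aa, .ab, .ac
cat movie.img.part.* > movie_joined.img  # downloader rejoins the parts
cmp movie.img movie_joined.img && echo "parts rejoin cleanly"
```

The downside is the same as with RAR: the downloader still has to fetch many pieces and reassemble them by hand.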
lidback
Linux User
Posts: 19
Joined: Fri Aug 12, 2005 6:26 pm

Post by lidback »

rocketeer wrote:I think you'll encounter all kinds of problems with applications handling files > 4 GB on a 32-bit machine.
Can you not split the file (for example with RAR)?
Yes, that is what I do right now, using *.rar files, but it would be easier if there were just one file to download instead of about 96 RAR files.
The way it looks right now, I think that is the only choice I have.
rocketeer
Pro
Posts: 75
Joined: Sun Aug 17, 2003 12:05 am

Post by rocketeer »

To elaborate a bit more, the message "Value too large for defined data type" could come from either apache itself OR a program used by apache.

If it is that apache can't deal with files larger than 4GB at all, then you will have to find out if apache can be recompiled or patched to deal with large files.

It could be that a program Apache uses is throwing that error. An idea that comes to mind is "gzip": Apache can compress files on the fly if the user-agent declares in the HTTP request that it understands compression, and most modern browsers do.
You could try disabling such features in the Apache config files. Or you could try GETting the file with telnet:

Code:

$ telnet yourhostname 80
GET /path/to/the/file.img HTTP/1.1
Host: yourhostname
<press Enter twice>
If this works you will see a lot of garbage on the screen as the download content comes to stdout. You might replace GET with HEAD first to see whether Apache returns correct headers or an error message.

The third option is that your filesystem can't handle such large files. If this is the case you would have trouble with commands like ls and stat. Do you use FatX or ext2/3, ReiserFS or some other fs?
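One way to test that third option without burning 5 GB of disk space is a sparse file: write a single byte past the 4 GB mark and see whether the filesystem and tools cope. This is a sketch; the path /tmp/bigfile.test is just an example:

```shell
#!/bin/sh
# Sketch: check whether the filesystem and userland tools handle a
# file larger than 4 GB, using a sparse file that occupies almost no
# real disk space. /tmp/bigfile.test is an example path.
dd if=/dev/zero of=/tmp/bigfile.test bs=1 count=1 seek=5368709119 2>/dev/null

ls -l /tmp/bigfile.test          # should report 5368709120 bytes (5 GiB)
stat -c '%s' /tmp/bigfile.test   # an error here points at the fs or tools
rm -f /tmp/bigfile.test
```

If ls or stat fails with the same "Value too large" message, the problem is below Apache, in the filesystem or the C library the tools were built against.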
lidback
Linux User
Posts: 19
Joined: Fri Aug 12, 2005 6:26 pm

Post by lidback »

I use ReiserFS on all my discs...
Do you know any filesystem other than ReiserFS that can handle larger files?

Shallax, do you have any plans to compile LFS (Large File Support) into the kernel in the future?
Or does anyone know?


EDIT: I've been reading about this on Google and found the answer.
Open your console and first define the following environment variables:
export CPPFLAGS=-D_FILE_OFFSET_BITS=64
export CXXFLAGS=-D_FILE_OFFSET_BITS=64
export CFLAGS=-D_FILE_OFFSET_BITS=64


then "emerge gdal"

Then, whenever you compile something and want LFS support, just define the environment variables before compiling with emerge.