
problems with writing / reading a data bytea

Silver Adventurer

problems with writing / reading a data bytea

Hello everyone, I have a database with a table with the following structure:

CREATE TABLE videos (
    id int8 NULL,
    data bytea NULL
);


Binary files are stored in it, and it is not in my hands to modify its structure. When writing/reading one specific row I get the following error (the table has 5 records):

 

select * from videos where id = 2;

 SQL Error [XX000]: ERROR: invalid memory alloc request size 1180388273

I can get all the other rows without problems

If I query the size of the binaries I get:

select id, length(data) as tam from videos;

 id |    tam
----+-----------
  3 |  10156992
  2 | 590194135
  5 | 193216177
  9 | 381569680
  6 | 424544121

As you can see, id 2 is about 590 MB (it is the only one that exceeds 500 MB); as far as I know, a bytea can store up to 1 GB.


The size in the error message (1180388273) is almost exactly twice the size of the binary :-O
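(A sanity check of that arithmetic; the constants below assume the default bytea_output = 'hex' and PostgreSQL's limit on single memory allocations:)

-- In hex output every byte becomes two hex digits, plus the \x prefix
-- and a string terminator:
select 2 * 590194135 + 3;        -- 1180388273, exactly the failing request
-- A single palloc cannot exceed MaxAllocSize (1073741823 bytes, ~1 GB),
-- so any bytea longer than about 512 MB cannot be rendered as text:
select (1073741823 - 3) / 2;     -- 536870910 bytes, roughly 512 MB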

 

If I query the TOAST table that stores the data of the videos table, I get:

select chunk_id, count(chunk_seq) from pg_toast.pg_toast_245165 group by 1;

 chunk_id | count
----------+--------
   245203 |   5089
   245205 | 295689
   245207 |  96802
   245209 | 191168
   245211 | 212698
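(Those chunk counts look consistent with the reported lengths; a quick check, assuming the default 8 kB block size, where TOAST chunks hold at most 1996 bytes:)

-- chunks * max chunk size should roughly match length(data):
select 295689 * 1996;             -- 590195244, vs 590194135 bytes for id = 2
select ceil(590194135 / 1996.0);  -- 295689 chunks, matching chunk_id 245205 above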

The only way to write and read this row is with PostgreSQL's internal large-object functions: lo_import (to load the bytea from the filesystem) and lo_export (to write the bytea out to the filesystem).

Do you have any idea how to solve this, or a lead to follow?

Is this a bug?

 

I cannot even do a pg_dump; I get the same error:

pg_dump: Dumping the contents of table "videos" failed: PQgetResult() failed.
pg_dump: Error message from server: ERROR: invalid memory alloc request size 1180388273
pg_dump: The command was: COPY public.videos (id, data) TO stdout;


I have changed disk, server, and OS (to CentOS 7) and the same thing happens, even on PostgreSQL 9.6.


I have tried in the versions:

EnterpriseDB 10.4.9 on x86_64-pc-linux-gnu, compiled by gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18), 64-bit

PostgreSQL 10.5 (Ubuntu 10.5-1.pgdg16.04+1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 5.4.0-6ubuntu1~16.04.10) 5.4.0 20160609, 64-bit
PostgreSQL 9.6.10 on x86_64-pc-linux-gnu (Ubuntu 9.6.10-1.pgdg16.04+1), compiled by gcc (Ubuntu 5.4.0-6ubuntu1~16.04.10) 5.4.0 20160609, 64-bit

 


And the platform is:

Ubuntu 16.04.1

CentOS 7.4

regards

EDB Team Member

Re: problems with writing / reading a data bytea

Hi asotolongo,

 

This indicates corruption of the table public.videos.

If you have a recent backup, please perform point-in-time recovery (PITR) of the table. After completing the PITR, execute a pg_dump and make sure it completes successfully, which would confirm there is no remaining corruption in the data.

 

If you do not have a backup, then you will have to:

1. Find the corrupt row(s):

Do a binary search to figure out which row is corrupt:

select * from public.videos limit <N>;

where N = 1 to 5, as you have 5 records.

 

Now, for whichever query gives you the error, find out the ctid of the row (selecting only ctid does not detoast the bytea, so this query will not fail):

select ctid from public.videos limit <N>;

It is the last ctid returned that is the one to remove:

delete from public.videos where ctid = '(block,offset)';
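For example, a hypothetical walk-through, assuming it is the 3rd row that turns out to be the bad one (the actual ctid will differ):

select * from public.videos limit 2;              -- ok
select * from public.videos limit 3;              -- ERROR: the 3rd row is bad
select ctid from public.videos offset 2 limit 1;  -- e.g. returns (153,4)
delete from public.videos where ctid = '(153,4)';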

 

Make sure your pg_dump executes successfully.

 

Hope this helps.

 

Silver Adventurer

Re: problems with writing / reading a data bytea

hi @chaitalirs, thanks for your response

but I don't think the data in the table public.videos is corrupt, because I can read/write all the data in the table except this one row

I have noticed that every time I write a file of more than 500 MB I get that error again; all the other files smaller than 500 MB work well for me

I can only access this row's data in the videos table using the functions lo_export (to write the bytea to the filesystem) and lo_import (to load the bytea from the filesystem)

Can you reproduce this by loading a file of more than 500 MB? Here is how:

in a table with this structure:

 

CREATE TABLE videos (
    id int8 NULL,
    dato bytea NULL
);


### python script to load the file
import psycopg2

conn_string = "host='localhost' dbname='db' user='postgres' password='password' port=5432"
conn = psycopg2.connect(conn_string)

# a2 is my video of 590 MB
my_video = open('/tmp/a2', 'rb').read()
cursor = conn.cursor()

### query = "INSERT INTO public.videos (id, dato) VALUES (%s, %s);" % (2, psycopg2.Binary(my_video))
cursor.execute("INSERT INTO public.videos (id, dato) VALUES (%s, %s);", (22, psycopg2.Binary(my_video)))
conn.commit()  ### here I get the error



All the other files smaller than 500 MB insert well.

The file a2 I could only load with lo_import, via the following function:

 

CREATE OR REPLACE FUNCTION public.bytea_import(
    IN  p_path   text,
    OUT p_result bytea)
  RETURNS bytea AS
$BODY$
declare
  l_oid oid;
begin
  select lo_import(p_path) into l_oid;
  select lo_get(l_oid) into p_result;
  perform lo_unlink(l_oid);
end;
$BODY$
  LANGUAGE plpgsql VOLATILE;
insert into videos select 2, bytea_import('/tmp/a2');  -- insert OK

select * from videos where id = 2;

SQL Error [XX000]: ERROR: invalid memory alloc request size 1180388273

If I insert a file of 143 MB, it inserts well and I can read it with a plain select.
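(A slice read might also avoid the huge allocation, since substr() on a bytea detoasts only the slice requested, at least when the value is stored uncompressed; an untested sketch:)

select substr(dato, 1, 100 * 1024 * 1024) from videos where id = 2;  -- first 100 MB only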

 

The only way I have found to get the whole file out of the database is with the lo_export function:

 

CREATE OR REPLACE FUNCTION public.lo_export_bfile(
    data     bytea,
    filename text)
  RETURNS integer AS
$BODY$
DECLARE
  l_oid oid;
  lfd integer;
  lsize integer;
  msg text;
  msg_detail text;
  msg_err text;
  INV_WRITE integer := 131072;
BEGIN
  l_oid := lo_create(0);
  lfd := lo_open(l_oid, INV_WRITE);  -- open in write mode
  lsize := lowrite(lfd, data);
  PERFORM lo_close(lfd);
  PERFORM lo_export(l_oid, filename);
  PERFORM lo_unlink(l_oid);
  RETURN lsize;
EXCEPTION
  WHEN OTHERS THEN
    GET STACKED DIAGNOSTICS msg = MESSAGE_TEXT, msg_detail = PG_EXCEPTION_DETAIL, msg_err = RETURNED_SQLSTATE;
    RAISE EXCEPTION 'Error: %, %, %', msg, msg_detail, msg_err;
END;
$BODY$
  LANGUAGE plpgsql VOLATILE
  COST 100;

select public.lo_export_bfile(dato, '/tmp/video2') from videos where id = 2;

 

Do you need more information to reproduce it?

regards

EDB Team Member

Re: problems with writing / reading a data bytea

Hi asotolongo,

 

Thanks for sharing the details.

Currently we are trying to reproduce the issue with the help of the details you shared; we will update you soon.

If we need more details, we will let you know.

EDB Team Member

Re: problems with writing / reading a data bytea

Hi asotolongo,

 

This seems to be an issue with insufficient free memory on the server. Is it possible for you to share memory usage statistics with us?

 

Thanks,

Ninad Shah

EDB Team Member

Re: problems with writing / reading a data bytea

Hi astolongo,

 

In addition to the previous response, some more information: the memory available on the server should be at least equal to the size of the file you wish to store; e.g. you cannot store a 500 MB file in the database if you do not have 500 MB of memory available.

 

Thanks & Regards,

Ninad Shah

Silver Adventurer

Re: problems with writing / reading a data bytea

 

thanks for your response @nshah and @Tushar-Takate

 

 

The memory:

free -m

              total        used        free         shared     buff/cache   available
Mem:           15920        5816        1797        5234        8306        4427
Swap:           8300         595        7705

shared_buffers, work_mem, maintenance_work_mem:

show shared_buffers;
 shared_buffers
----------------
 4GB
(1 row)

show work_mem;
 work_mem
----------
 4MB
(1 row)

show maintenance_work_mem;
 maintenance_work_mem
----------------------
 64MB
(1 row)

 

 

Do you need any other data?

regards

EDB Team Member

Re: problems with writing / reading a data bytea

Hi asotolongo,

 

Thank you for your response. We are working on the data you provided.

 

Thanks,

Ninad

Silver Adventurer

Re: problems with writing / reading a data bytea

thanks, I will be waiting :D

 

regards

EDB Team Member

Re: problems with writing / reading a data bytea

Hi asotolongo,

 

It seems that there is not enough memory to convert the video file to bytea format.

Hence, we request you to add additional memory to the server and let us know the results after that.

Awaiting your reply.

 

Regards,

Dhananjay