large table (Oracle 7.3.4.5.0, Solaris 8)
large table [message #675535] Thu, 04 April 2019 03:34
reym21
Messages: 241
Registered: February 2010
Location: Philippines
Senior Member

Gurus,

Please help me with a table that takes a very long time to return its record count when I run:

select count(*) from user_ledger_trans;

This table is 99.95% full (ALLOCATED_MB=12,680; USED_MB=12,673.87).
We only identified its NUM_ROWS (141,704,779) by executing:

select * from dba_tables
where table_name='USER_LEDGER_TRANS';

Our end-users cannot proceed anymore to process any update on this table.
The table size is around 2GB already.

Thank you in advance.
Re: large table [message #675536 is a reply to message #675535] Thu, 04 April 2019 03:56
John Watson
Messages: 8922
Registered: January 2010
Location: Global Village
Senior Member
Quote:
cannot proceed anymore to process any update
This is not an error message that I have seen before. Perhaps you have some other messages that might be more helpful?
Re: large table [message #675538 is a reply to message #675536] Thu, 04 April 2019 04:08
cookiemonster
Messages: 13917
Registered: September 2008
Location: Rainy Manchester
Senior Member
2GB isn't actually that much. The largest table I've got is 14GB.

If you've got slow queries/updates you need to look at the execution plan and see what it's doing.
We can't make any useful suggestions with the information you've provided so far.
Re: large table [message #675539 is a reply to message #675536] Thu, 04 April 2019 04:10
gazzag
Messages: 1118
Registered: November 2010
Location: Bedwas, UK
Senior Member
I think that that was more of a statement by the OP than an error message, John.
Re: large table [message #675542 is a reply to message #675535] Thu, 04 April 2019 05:37
Michel Cadot
Messages: 68625
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator

Quote:
ALLOCATED_MB=12,680
Quote:
The table size is around 2GB already.

These two figures look inconsistent to me.

Re: large table [message #675545 is a reply to message #675539] Thu, 04 April 2019 06:24
EdStevens
Messages: 1376
Registered: September 2013
Senior Member
gazzag wrote on Thu, 04 April 2019 04:10
I think that that was more of a statement by the OP than an error message, John.
I think that was John's way of saying "you've provided no useful information with which to address your problem".
Re: large table [message #675547 is a reply to message #675545] Thu, 04 April 2019 08:28
gazzag
Messages: 1118
Registered: November 2010
Location: Bedwas, UK
Senior Member
Thanks, Ed. Sorry, John!
Re: large table [message #675548 is a reply to message #675535] Thu, 04 April 2019 08:44
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
reym21 wrote on Thu, 04 April 2019 01:34
Gurus,

Please help me with a table that takes a very long time to return its record count when I run:

select count(*) from user_ledger_trans;

This table is 99.95% full (ALLOCATED_MB=12,680; USED_MB=12,673.87).
We only identified its NUM_ROWS (141,704,779) by executing:

select * from dba_tables
where table_name='USER_LEDGER_TRANS';

Our end-users cannot proceed anymore to process any update on this table.
The table size is around 2GB already.

Thank you in advance.

Consider collecting statistics on the USER_LEDGER_TRANS table and its indexes.
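On Oracle 7 that means ANALYZE (DBMS_STATS came later). A minimal sketch; the index name below is a placeholder, since the real names aren't shown yet:

```sql
-- COMPUTE reads the whole segment; on ~141M rows, ESTIMATE with a
-- sample is far cheaper and usually good enough for the optimizer.
ANALYZE TABLE user_ledger_trans ESTIMATE STATISTICS SAMPLE 10 PERCENT;

-- Repeat for each index (placeholder name):
ANALYZE INDEX user_ledger_trans_ix1 ESTIMATE STATISTICS;
```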
Re: large table [message #675576 is a reply to message #675548] Sun, 07 April 2019 21:49
reym21
Messages: 241
Registered: February 2010
Location: Philippines
Senior Member

Sirs,

Thank you for your prompt replies.
I got some details using Toad (sorry, I can't format it; please refer to the attachment):


Table Owner 	Table Name 		Mb Allocated 	Mb Used 	Pct Full (before next extend) 	Tablespace
--------------- ----------------------- --------------- --------------- ------------------------------- ----------------
ABC 		USER_LEDGER_TRANS 	18580 		18560 		99.89 				USR_DATA_LRG

Parameter 	Value
--------------- -----------------
Size 		18 Gb
Number Extents 	830
OWNER 		ABC
TABLE_NAME 	USER_LEDGER_TRANS
TABLESPACE_NAME USR_DATA_LRG
CLUSTER_NAME
PCT_FREE 	20
PCT_USED 	60
INI_TRANS 	1
MAX_TRANS 	255
INITIAL_EXTENT 	2 Gb
NEXT_EXTENT 	20 Mb
MIN_EXTENTS 	1
MAX_EXTENTS 	1200
PCT_INCREASE 	0
FREELISTS 	1
FREELIST_GROUPS 1
BACKED_UP 	N
NUM_ROWS 	141704779
BLOCKS 		2375678
EMPTY_BLOCKS 	2560
AVG_SPACE 	1661
CHAIN_CNT 	2
AVG_ROW_LEN 	14
DEGREE 		1
INSTANCE 	1


Also, our users experienced the two errors below:
"F 02084: Batch driven execution failed. Cancel to exit." and
"I 01025: A fatal error has occurred. Contact Help Desk."

Do I need to recreate the table or the tablespace?

Thank you very much for your time.


@Michel: I came up with the 2 GB size by multiplying AVG_ROW_LEN by NUM_ROWS, which gives about 1.84 GB.
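As a quick check of that arithmetic (raw row data only; it ignores block overhead, PCT_FREE of 20, and free space in allocated extents, which is why it is far below the 18 GB the segment actually occupies):

```sql
-- 141,704,779 rows x 14 bytes/row, converted to GB:
SELECT 141704779 * 14 / 1024 / 1024 / 1024 AS approx_gb FROM dual;
-- roughly 1.85
```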
*BlackSwan added {code} tags. PLEASE do so yourself in the future!
  • Attachment: abc_toad.jpg
    (Size: 24.32KB, Downloaded 1861 times)

[Updated on: Sun, 07 April 2019 22:01] by Moderator


Re: large table [message #675577 is a reply to message #675576] Sun, 07 April 2019 22:05
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
>Please help me with a table that takes a very long time to return its record count
We can't say what is wrong since you decided that we did not need to see the actual slow SQL.


Since NOBODY can optimize SQL just by looking at it, we need a few more details.
http://www.orafaq.com/forum/mv/msg/84315/433888/#msg_433888
Please refer to URL above & be sure to provide the details requested:
1) DDL for all tables & indexes
2) EXPLAIN PLAN
3) output from SQL_TRACE & tkprof
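For (3), a minimal sketch on Oracle 7 (the trace file appears in USER_DUMP_DEST; tkprof is run from the OS prompt, and its options vary by release):

```sql
-- Trace just the slow statement in your own session:
ALTER SESSION SET SQL_TRACE = TRUE;

SELECT COUNT(*) FROM user_ledger_trans;

ALTER SESSION SET SQL_TRACE = FALSE;

-- Then, from the shell (trace file name will differ):
--   tkprof ora_1234.trc report.prf sys=no
```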
Re: large table [message #675584 is a reply to message #675576] Mon, 08 April 2019 01:22
John Watson
Messages: 8922
Registered: January 2010
Location: Global Village
Senior Member


Quote:
"F 02084: Batch driven execution failed. Cancel to exit." and
"I 01025: A fatal error has occurred. Contact Help Desk."
These do not look like Oracle error messages to me, more like something generated by your application. Better ask your developers or application vendor what they mean.
Re: large table [message #675586 is a reply to message #675584] Mon, 08 April 2019 02:28
Roachcoach
Messages: 1576
Registered: May 2010
Location: UK
Senior Member
>>Oracle 7.3.4.5.0, Solaris 8

I'll be surprised if the application devs are still in the game and even more surprised if they can remember this.
Re: large table [message #675589 is a reply to message #675586] Mon, 08 April 2019 03:31
reym21
Messages: 241
Registered: February 2010
Location: Philippines
Senior Member

@BlackSwan,

Sir, I attached the files you requested; I combined them into one file.

Thanks for your help.
Re: large table [message #675590 is a reply to message #675584] Mon, 08 April 2019 03:33
reym21
Messages: 241
Registered: February 2010
Location: Philippines
Senior Member

@John Watson,

Sir, this is good feedback and a good basis to work from.

Thanks & best regards,

[Updated on: Mon, 08 April 2019 03:34]


Re: large table [message #675591 is a reply to message #675590] Mon, 08 April 2019 04:04
cookiemonster
Messages: 13917
Registered: September 2008
Location: Rainy Manchester
Senior Member
It's generally better if you post stuff directly in [code] tags rather than attaching a file.
Not everyone can/will open files.

Nevertheless, I've had a look, and what you've got in the file doesn't make sense.
You've got an explain plan that is supposedly for
select count(*) from user_ledger_trans

But it can't possibly be, since user_ledger_trans appears multiple times in the explain plan along with a lot of other objects.
That explain plan is for some much more complex query.

According to the tkprof output the select count(*) is taking 48 seconds.

An explain plan for the actual query would help.
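Something like this, as a sketch (assumes a PLAN_TABLE created with utlxplan.sql; substitute your real slow query for the count):

```sql
EXPLAIN PLAN SET STATEMENT_ID = 'slow1' FOR
SELECT COUNT(*) FROM user_ledger_trans;

-- Indented display of the stored plan:
SELECT LPAD(' ', 2 * (LEVEL - 1)) || operation
       || ' ' || options || ' ' || object_name AS plan_step
  FROM plan_table
 WHERE statement_id = 'slow1'
 START WITH id = 0
CONNECT BY PRIOR id = parent_id;
```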
Re: large table [message #675593 is a reply to message #675589] Mon, 08 April 2019 04:30
John Watson
Messages: 8922
Registered: January 2010
Location: Global Village
Senior Member
Having seen the DDL, the quickest way to run
select count(*) from user_ledger_trans;
(if that really is the query you want to run) would likely be an index fast full scan of abc.USER_LEDGER_TRANS_N2 or abc.USER_LEDGER_TRANS_N1. If the CBO isn't doing that, what happens if you hint it? If the index FFS is still not quick enough, the only other possibility I can think of is to throw parallel query at it.
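Something like this, as a sketch (index and owner names taken from the attachment; verify that INDEX_FFS is recognized on your exact 7.3 patch level, and that the hinted index is on a NOT NULL column):

```sql
-- Hint the CBO toward a fast full scan of one of the indexes:
SELECT /*+ INDEX_FFS(t user_ledger_trans_n1) */ COUNT(*)
  FROM abc.user_ledger_trans t;

-- Or, if parallel query is configured and the box can take it:
SELECT /*+ FULL(t) PARALLEL(t, 4) */ COUNT(*)
  FROM abc.user_ledger_trans t;
```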
Re: large table [message #675594 is a reply to message #675593] Mon, 08 April 2019 04:33
cookiemonster
Messages: 13917
Registered: September 2008
Location: Rainy Manchester
Senior Member
Do hints even exist in Oracle 7?
And is it using the CBO? Could well be using the RBO.
Re: large table [message #675596 is a reply to message #675594] Mon, 08 April 2019 04:39
John Watson
Messages: 8922
Registered: January 2010
Location: Global Village
Senior Member
Hints were there. However, you may well be right about it being set to use the RBO.

@reym21, you need to be sure that you have optimizer_goal=all_rows and that you have analyzed all your objects.
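That is, roughly (a sketch; NUM_ROWS comes back NULL for a table that has never been analyzed):

```sql
-- Switch the session to the cost-based optimizer:
ALTER SESSION SET OPTIMIZER_GOAL = ALL_ROWS;

-- Confirm the table has statistics:
SELECT table_name, num_rows, blocks
  FROM dba_tables
 WHERE table_name = 'USER_LEDGER_TRANS';
```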
Re: large table [message #675598 is a reply to message #675593] Mon, 08 April 2019 08:03
JPBoileau
Messages: 88
Registered: September 2017
Member
I'll second the INDEX_FFS hint. I'd be cautious with using parallel, especially if this is used in an OLTP environment.

I wasn't able to open the attached text file (not authorized error, even though I'm logged in).

Oracle 7.3.x has a fairly comprehensive set of hints that can be used. You can look it up here: https://docs.oracle.com/pdf/A48506_1.pdf

JP
Re: large table [message #675626 is a reply to message #675535] Wed, 10 April 2019 08:46
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
Is the installed Oracle software 32-bit or 64-bit?
Re: large table [message #675630 is a reply to message #675626] Wed, 10 April 2019 14:20
ZeeshanAwan
Messages: 2
Registered: November 2017
Junior Member
My question is off topic, I'm sorry.

Were there any recent changes to the hardware, server, RAM, or network in your environment?
Re: large table [message #675707 is a reply to message #675626] Mon, 15 April 2019 00:56
reym21
Messages: 241
Registered: February 2010
Location: Philippines
Senior Member

@BlackSwan,

Sir, this is a 32-bit database.

Thanks.