Thursday, May 20, 2010

Killing RAC Sessions

SET LINESIZE 100
COLUMN spid FORMAT A10
COLUMN username FORMAT A10
COLUMN program FORMAT A45

SELECT s.inst_id,
       s.sid,
       s.serial#,
       p.spid,
       s.username,
       s.program
FROM   gv$session s
       JOIN gv$process p ON p.addr = s.paddr AND p.inst_id = s.inst_id
WHERE  s.type != 'BACKGROUND';


The basic syntax for killing a session is shown below.


SQL> ALTER SYSTEM KILL SESSION 'sid,serial#';
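
For example, assuming the query above returned a session with SID 30 and serial# 7 (values shown here purely for illustration), the command would look like this:

SQL> ALTER SYSTEM KILL SESSION '30,7';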



In a RAC environment, you can optionally specify the INST_ID, shown when querying the GV$SESSION view. This allows you to kill a session on a different RAC node.


SQL> ALTER SYSTEM KILL SESSION 'sid,serial#,@inst_id';
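
For example, to kill the same illustrative session when it is connected to instance 2 (again, the SID, serial# and instance number are made up for this example):

SQL> ALTER SYSTEM KILL SESSION '30,7,@2';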


In addition to the syntax described above, you can add the IMMEDIATE clause.


SQL> ALTER SYSTEM KILL SESSION 'sid,serial#' IMMEDIATE;


DISCONNECT SESSION


SQL> ALTER SYSTEM DISCONNECT SESSION 'sid,serial#' POST_TRANSACTION;

SQL> ALTER SYSTEM DISCONNECT SESSION 'sid,serial#' IMMEDIATE;
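
Using the same illustrative SID and serial# as before, a disconnect that waits for the current transaction to complete would look like this:

SQL> ALTER SYSTEM DISCONNECT SESSION '30,7' POST_TRANSACTION;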


The POST_TRANSACTION and IMMEDIATE clauses can be used together, but the documentation states that in this case the IMMEDIATE clause is ignored. In addition, the syntax diagram suggests both clauses are optional, but in reality, one or both must be specified or you receive an error.


SQL> alter system disconnect session '30,7';
alter system disconnect session '30,7'
*
ERROR at line 1:
ORA-02000: missing POST_TRANSACTION or IMMEDIATE keyword


The Windows Approach

To kill the session on the Windows operating system, first identify the session, then substitute the relevant SID and SPID values into the following command issued from the command line.


C:> orakill ORACLE_SID spid
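
For example, if the instance name is ORCL and the SPID is 1234 (both values are hypothetical), the command would be:

C:> orakill ORCL 1234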


The UNIX Approach

To kill the session on UNIX or Linux operating systems, first identify the session, then substitute the relevant SPID into the following command.


% kill spid

If after a few minutes the process hasn't stopped, terminate the session using the following.


% kill -9 spid

If in doubt, check that the SPID matches the UNIX process ID shown by the following command.


% ps -ef | grep ora


----------------Session Waits--------------------

SQL> select event, seconds_in_wait, sid from v$session_wait where sid in (10,14);
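
In a RAC database you can query the GV$ version of the view instead, which adds the INST_ID column, for example:

SQL> select inst_id, event, seconds_in_wait, sid from gv$session_wait where sid in (10,14);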

Export and Import Dump

------------Importing Full Dump File-------------




If you want to import all the objects in a dump file, you can use the following command.



$ impdp hr/hr DUMPFILE=dpump_dir1:expfull.dmp FULL=y LOGFILE=dpump_dir2:full_imp.log



This example imports everything from the expfull.dmp dump file. In this example, a DIRECTORY parameter is not provided, so a directory object must be provided on both the DUMPFILE parameter and the LOGFILE parameter.
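
If the directory objects do not already exist, a DBA can create them and grant access to the importing user first. A minimal sketch, using hypothetical file system paths:

SQL> CREATE OR REPLACE DIRECTORY dpump_dir1 AS '/u01/app/oracle/dpump1';  -- paths are hypothetical
SQL> CREATE OR REPLACE DIRECTORY dpump_dir2 AS '/u01/app/oracle/dpump2';
SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;
SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir2 TO hr;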



-------------Importing Objects of One Schema to another Schema---------------



The following example loads all tables belonging to the hr schema into the scott schema.



$ impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp REMAP_SCHEMA=hr:scott



If the SCOTT account exists in the database, the hr objects will be loaded into the scott schema. If the scott account does not exist, the Import utility will create the SCOTT account with an unusable password, because the dump file was exported by the user SYSTEM and imported by the user SYSTEM, who has DBA privileges.
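
Because the password is unusable, you would normally reset it before anyone can log in as scott, for example (the new password below is just a placeholder):

SQL> ALTER USER scott IDENTIFIED BY new_password;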



-------------Loading Objects of one Tablespace to another Tablespace--------------------



You can use the REMAP_TABLESPACE option to import objects from one tablespace into another tablespace with the following command.



$ impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp REMAP_TABLESPACE=users:sales



The above example loads the tables stored in the users tablespace into the sales tablespace.
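
To confirm where the tables ended up after the import, you can check the data dictionary, for example:

SQL> SELECT table_name, tablespace_name FROM dba_tables WHERE owner = 'HR';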



---------Generating SQL File containing DDL commands using Data Pump Import---------



You can generate a SQL file containing all the DDL commands that Import would have executed if you had actually run the Import utility.



The following is an example of using the SQLFILE parameter.



$ impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SQLFILE=dpump_dir2:expfull.sql



A SQL file named expfull.sql is written to dpump_dir2.



-----------------------Importing objects of only a Particular Schema---------------------



$ impdp hr/hr SCHEMAS=hr,oe DIRECTORY=dpump_dir1 LOGFILE=schemas.log DUMPFILE=expdat.dmp



-----------------Importing Only Particular Tables---------------



The following example shows a simple use of the TABLES parameter to import only the employees and jobs tables from the expfull.dmp file. You can create the expfull.dmp dump file used in this example by running the example provided for the Full Database Export in the previous topic.



$ impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp TABLES=employees,jobs



This imports only the employees and jobs tables from the dump file.