Oracle Streams – Reference Article for the New Oracle DBA

You are a new Oracle DBA and would like to know what Oracle Streams is.

Oracle Streams



Oracle Streams enables the propagation and management of data, transactions and events in a data stream either within a database, or from one database to another. The stream routes published information to subscribed destinations. The result is a new feature that provides greater functionality and flexibility than traditional solutions for capturing and managing events, and sharing the events with other databases and applications. As users’ needs change, they can simply implement a new capability of Oracle Streams, without sacrificing existing capabilities.


Oracle Streams provides a set of elements that allow users to control what information is put into a stream, how the stream flows or is routed from node to node, what happens to events in the stream as they flow into each node, and how the stream terminates. By specifying the configuration of the elements acting on the stream, a user can address specific requirements.

The architecture of Oracle Streams is very flexible. Streams contains three basic elements:

  • Capture
  • Staging
  • Consumption


Capture

Oracle Streams supports capture of events (database changes and application-generated messages) into the staging area. These events are captured in two ways. With implicit capture, the server captures DML and DDL events at a source database. Explicit capture allows applications to explicitly generate events and place them in the staging area.

With implicit capture, the capture process retrieves change data extracted from the redo log, either by hot mining the online redo log or, if necessary, by mining archived log files. After retrieving the data, the capture process formats it into a Logical Change Record (LCR) and places it in a staging area for further processing. The capture process can intelligently filter LCRs based upon defined rules. Thus, only changes to desired objects are captured.
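As a minimal sketch, implicit capture with a table-level rule can be configured with the Oracle-supplied PL/SQL packages the article describes. The administrator account (strmadmin), queue name, and table (hr.employees) below are illustrative assumptions, not fixed by the article:

```sql
-- Illustrative sketch: configure implicit capture of DML on hr.employees.
-- User, queue, and table names are assumptions.
BEGIN
  -- Create a rule so only DML changes to hr.employees are captured.
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name   => 'hr.employees',
    streams_type => 'capture',
    streams_name => 'capture_stream',
    queue_name   => 'strmadmin.streams_queue',
    include_dml  => TRUE,
    include_ddl  => FALSE);

  -- Start the capture process; it mines the redo log, formats changes
  -- into LCRs, and places them in the staging area.
  DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'capture_stream');
END;
/
```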

User applications can explicitly enqueue user messages representing events into the staging area. These messages can be formatted as LCRs, which will allow them to be consumed by the apply engine, or they can be formatted for consumption by another user application.
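As a sketch of explicit capture, an application can enqueue its own message into a Streams (ANYDATA) queue using DBMS_AQ. The queue name and payload below are assumptions; a secure queue may additionally require an agent to be associated with the enqueuing user:

```sql
-- Illustrative sketch: explicitly enqueue a user message into a staging area.
DECLARE
  enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
  msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  msg_id    RAW(16);
BEGIN
  -- Streams staging areas hold SYS.ANYDATA payloads, so wrap the message.
  DBMS_AQ.ENQUEUE(
    queue_name         => 'strmadmin.streams_queue',
    enqueue_options    => enq_opts,
    message_properties => msg_props,
    payload            => SYS.ANYDATA.CONVERTVARCHAR2('order 1042 shipped'),
    msgid              => msg_id);
  COMMIT;
END;
/
```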


Staging

Once captured, events are placed in a staging area. The staging area is a queue that provides a service to store and manage captured events. Staging provides a holding area with security, as well as auditing and tracking of LCR data.
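The staging queue itself is typically created with an Oracle-supplied package. A minimal sketch, with assumed schema and queue names:

```sql
-- Illustrative sketch: create an ANYDATA staging queue for Streams.
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'strmadmin.streams_queue_table',
    queue_name  => 'strmadmin.streams_queue',
    queue_user  => 'strmadmin');  -- user granted enqueue/dequeue privileges
END;
/
```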

Subscribers examine the contents of the staging area and determine whether they have an interest in an event. A subscriber can be a user application, another staging area (usually on another system), or the default apply process.


If the subscriber is another staging area, the event is propagated to the other staging area, either within the same database or in a remote database, as appropriate. To simplify network routing and reduce WAN traffic, events need not be sent to all databases and applications. Rather, they can be directed through staging areas on one or more systems until they reach the subscribing system. For example, an event may propagate via a hub database that does not actually apply the event. A single staging area can stage events from multiple databases, simplifying setup and configuration.
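Propagation between staging areas is also declared through rules. The sketch below assumes a database link (dest_db), a source database name (src_db), and matching queue names at both sites:

```sql
-- Illustrative sketch: propagate changes to hr.employees from the local
-- staging area to a queue on a remote database (names and db link assumed).
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
    table_name             => 'hr.employees',
    streams_name           => 'prop_src_to_dest',
    source_queue_name      => 'strmadmin.streams_queue',
    destination_queue_name => 'strmadmin.streams_queue@dest_db',
    include_dml            => TRUE,
    include_ddl            => FALSE,
    source_database        => 'src_db');
END;
/
```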


Consumption

Events in a staging area are consumed by the apply engine, where the changes they represent are applied to a database, or they are consumed by an application. Oracle Streams includes a flexible apply engine that allows use of a standard or custom apply function, enabling data to be transformed when necessary. Support for explicit dequeue allows application developers to use Oracle Streams to notify applications of changes to data, while still leveraging the change capture and propagation features of Oracle Streams.

Default Apply

The default apply engine applies DML changes and DDL changes represented by implicitly or explicitly captured LCRs. The default apply engine will detect conflicts where the destination row has been changed and does not contain the expected values. If a conflict is detected, then a resolution routine may be invoked.
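A resolution routine can be declared ahead of time with a prebuilt method. The sketch below registers an OVERWRITE handler (incoming change wins) for an assumed table and column list:

```sql
-- Illustrative sketch: resolve update conflicts on hr.employees by letting
-- the incoming (source) change overwrite the destination row.
DECLARE
  cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols(1) := 'salary';  -- column(s) covered by this handler (assumed)
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'hr.employees',
    method_name       => 'OVERWRITE',
    resolution_column => 'salary',
    column_list       => cols);
END;
/
```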

User-Defined Function Apply

The apply engine can pass the LCR or a user message to a user-defined function. This provides the greatest amount of flexibility in processing an event. A typical use of a user-defined function is to reformat the data represented by the LCR before applying it to a local table: for example, field-format, object-name, and column-name mapping transformations. A user-defined function could also be used to perform column subsetting, or to update other objects that may not be present in the source database.
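A sketch of such a handler: the procedure receives the LCR as ANYDATA, modifies it, and executes it. The procedure, table, and operation names below are assumptions:

```sql
-- Illustrative sketch: a user-defined apply handler that renames the target
-- table of each row LCR before executing it (all names are assumptions).
CREATE OR REPLACE PROCEDURE strmadmin.emp_dml_handler(in_any IN SYS.ANYDATA) IS
  lcr SYS.LCR$_ROW_RECORD;
  rc  PLS_INTEGER;
BEGIN
  rc := in_any.GETOBJECT(lcr);            -- extract the row LCR
  lcr.SET_OBJECT_NAME('employees_copy');  -- redirect to a local table
  lcr.EXECUTE(TRUE);                      -- apply the transformed change
END;
/

BEGIN
  -- Register the handler for UPDATE LCRs on hr.employees.
  DBMS_APPLY_ADM.SET_DML_HANDLER(
    object_name    => 'hr.employees',
    object_type    => 'TABLE',
    operation_name => 'UPDATE',
    error_handler  => FALSE,
    user_procedure => 'strmadmin.emp_dml_handler');
END;
/
```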

Explicit Dequeue

User applications can explicitly dequeue LCRs or user messages from the receiving staging area. This allows a user application to efficiently access the data in a Streams staging area. Streams can send notifications to registered PL/SQL or OCI functions, giving applications an alternative to polling for new messages. Of course, applications can still poll, or even wait, for new subscribed messages in the staging area to become available.
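A polling-style explicit dequeue might look like the following sketch; the queue and subscriber names are assumptions:

```sql
-- Illustrative sketch: an application explicitly dequeues one message from
-- the staging area (queue and subscriber names are assumptions).
DECLARE
  deq_opts  DBMS_AQ.DEQUEUE_OPTIONS_T;
  msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  payload   SYS.ANYDATA;
  msg_id    RAW(16);
BEGIN
  deq_opts.consumer_name := 'app_subscriber';  -- this application's agent
  deq_opts.wait          := DBMS_AQ.NO_WAIT;   -- poll instead of blocking
  DBMS_AQ.DEQUEUE(
    queue_name         => 'strmadmin.streams_queue',
    dequeue_options    => deq_opts,
    message_properties => msg_props,
    payload            => payload,
    msgid              => msg_id);
  COMMIT;
END;
/
```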


Rules

Streams lets users control which information to share and where to send it by specifying rules. At the highest level, users can indicate whether they want to capture, propagate, or apply changes at the table, schema, or global (database) level. For more complex requirements, for example, to apply only a particular subset of data at a given location, users can specify a rule condition similar to the condition in the WHERE clause of a SQL query. If necessary, related rules can be grouped into rule sets.
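A subset rule of this kind can be declared directly; in the sketch below, only rows matching the WHERE-clause-like condition are applied at this site (table, condition, and component names are assumptions):

```sql
-- Illustrative sketch: apply only a subset of hr.employees rows at this
-- site, using a WHERE-clause-like condition.
BEGIN
  DBMS_STREAMS_ADM.ADD_SUBSET_RULES(
    table_name    => 'hr.employees',
    dml_condition => 'department_id = 50',
    streams_type  => 'apply',
    streams_name  => 'apply_stream',
    queue_name    => 'strmadmin.streams_queue');
END;
/
```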


Transformations

A transformation is a change in the form of an object participating in capture and apply, or a change in the data it holds. Transformations can include changing the datatype representation of a particular column in a table at a particular site, adding a column to a table at one site only, or including a subset of the data in a table at a particular site.

A transformation can be specified during enqueue, to transform the message to the correct type before inserting it into the staging area. It can also be specified for propagation, which may be useful for subsetting data before it is sent to a remote site. Finally, it can be specified at dequeue or local apply, which can be useful for formatting a message in a manner appropriate for a specific destination.
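In later releases (Oracle 10g syntax), a user-written transformation function can be attached declaratively to a rule; the rule and function names below are assumptions, and the function itself must accept and return SYS.ANYDATA:

```sql
-- Illustrative sketch (10g syntax): attach a user-written transform function
-- to a capture rule; rule and function names are assumptions.
BEGIN
  DBMS_STREAMS_ADM.SET_RULE_TRANSFORM_FUNCTION(
    rule_name          => 'strmadmin.employees_capture_rule',
    transform_function => 'strmadmin.uppercase_names');
END;
/
```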

Heterogeneous Support

Oracle Streams is an open information sharing solution: each element supports industry standards. Streams supports capture and apply from Oracle to non-Oracle systems. Changes can be applied to a non-Oracle system via a transparent gateway or generic connectivity. Streams also includes an API that allows non-Oracle data sources to submit or receive change records, allowing for heterogeneous data movement in both directions. In addition, messages can be sent to and received from other message queuing systems, such as IBM MQSeries and TIBCO, via the Messaging Gateway.


Several tools are available for configuring, administering, and monitoring a Streams environment. The primary interface to Streams is a collection of Oracle-supplied PL/SQL packages. To help users configure, administer, and monitor their Streams environments, Oracle provides a Streams tool in the Oracle Enterprise Manager Console. Users can also use the Streams tool to generate Streams configuration scripts, which they can then modify and run to configure their Streams environment. Additionally, Streams data dictionary views keep users informed about their Streams environment.
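For example, the main Streams components can be monitored with a few dictionary queries (a sketch; the exact columns available vary by release):

```sql
-- Illustrative sketch: monitor Streams components via data dictionary views.
SELECT capture_name, status FROM dba_capture;
SELECT propagation_name, destination_dblink FROM dba_propagation;
SELECT apply_name, status FROM dba_apply;
```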

Deployment for Specific Markets

Oracle Streams satisfies customers' information sharing requirements in a variety of markets. For example, customers can use Oracle Streams to create event notification, replication, and data warehouse loading solutions. Oracle also provides features built on Streams that extend its capabilities for specific tasks: Advanced Queuing is built on Oracle Streams and provides robust message queuing functionality integrated with the Oracle9i Database. Of course, all customers can utilize the full power of Oracle Streams and create configurations that span multiple markets, enabling new classes of applications. In addition, all deployments and their associated metadata are compatible. For example, a replication installation can easily be extended to load a data warehouse or enable bi-directional replication; a complete reconfiguration is not required.

Oracle Streams: Single, Unified Solution

Oracle Streams satisfies the most demanding information sharing requirements using a common infrastructure. Complex distributed environments benefit from a single solution that satisfies their information sharing requirements. As an organization grows, developers and administrators can be confident that Oracle Streams has the flexibility to meet their changing requirements.

Single, Unified Solution

  • Satisfies all data sharing needs with a single solution.
  • Supports deployment of a variety of configurations:
    • Replication
    • Message Queuing
    • Data Warehouse Loading
  • Allows data to be transparently shared between both Oracle and non-Oracle data stores.

Integrated Feature of Oracle9i Database

  • No additional software to install. No special commands to learn.
  • Takes advantage of reliability and security provided with Oracle9i Database.


  • Provides maximum flexibility for configuration and administration with Oracle-supplied PL/SQL packages.
  • Provides wizards and monitoring capabilities for ease of use with Streams tool in Oracle Enterprise Manager.
  • Keeps users informed about their environment through data dictionary views.



  • Capture
    • Implicitly captures both DML and DDL changes.
    • Filters captured changes based on user-defined rules.
    • Lets applications explicitly enqueue user messages into staging area.
  • Staging/Propagation
    • Provides secure holding area with auditing and tracking of captured data.
    • Allows data to be routed through staging areas before being applied.
    • Supports transformation of data as it is enqueued into or dequeued from the staging area, or prior to propagation.
  • Consumption
    • Automatically detects and resolves conflicts before applying changes using default apply mechanism.
    • Gives user complete control over how data is modified and applied with user-defined apply mechanism.
    • Allows applications to directly access data in the Streams staging area using explicit dequeue.


  • Hot mining of the online redo log reduces the latency of data capture.
  • Parallel capture and apply processes ensure maximum throughput for concurrent events.



