Search Results

Search found 64715 results on 2589 pages for 'spring data rest'.

Page 29/2589

  • Using a REST API and iPhone/Objective-C

    - by Neil Desai
    So I'm brand new to Netflix's API and have never used an API before. I'm OK with Objective-C and Cocoa Touch, but I have no clue where to start when it comes to accessing the API, or how to use one in general. Can someone help me get started with some code that will access titles in Netflix, or just explain how to access a REST API in general with authentication? Thanks. Update: I've looked at the documents and I'm still a little lost, because the Netflix API is a little weird with OAuth. Any help?

    Read the article

  • Framework for Implementing REST web service in Django

    - by Laizer
    I'm looking to implement a RESTful interface for a Django application. It is primarily a data-service application - the interface will be (at this point) read-only. The question is which Django toolsets / frameworks make the most sense for this task. I see Django-rest and Django-piston. There's also the option of rolling my own. The question was asked here, but a good two years back. I'd like to know what the current state of play is. In this question, circa 2008, the strong majority vote was to not use any framework at all - just create Django views that reply with e.g. JSON. (The question was also addressed, circa 2008, here.) In the current landscape, what makes the most sense?
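
    For the roll-your-own option those older answers favoured, the whole "framework" can be a plain view that returns JSON. A minimal sketch, assuming a hypothetical Widget model (the app and field names are illustrative, not from the question):

        # views.py -- a hand-rolled, read-only JSON endpoint.
        # "Widget" and its fields are placeholders for whatever the app exposes.
        import json

        from django.http import HttpResponse

        from myapp.models import Widget


        def widget_list(request):
            data = [{"id": w.pk, "name": w.name} for w in Widget.objects.all()]
            return HttpResponse(json.dumps(data), content_type="application/json")

    Wiring it up is a single urlconf entry; content negotiation, pagination and the rest are exactly what the frameworks mentioned above add on top.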

    Read the article

  • Unknown error when submitting a REST request to the Liferay JSON API

    - by r.rodriguez
    I'm writing a script in Python to automatically update the structures in my Liferay portal, and I want to do it via the JSON REST API. I make a request to get a structure (method getStructure), and it worked. But when I try to do a structure update in the portal, it shows me the following error: ValueError: Content-Length should be specified for iterable data of type class 'dict' {'serviceContext': "{'prueba'}", 'serviceClassName': 'com.liferay.portlet.journal.service.JournalStructureServiceUtil', 'name': 'FOO', 'xsd': '... THE XSD OBTAINED VIA JSON ...', 'serviceParameters': '[groupId,structureId,parentStructureId,name,description,xsd,serviceContext]', 'description': 'FOO Structure', 'serviceMethodName': 'updateStructure', 'groupId': '10133'} What I'm doing is this: urllib.request.Request(url = URL, data = data_update, headers = headers) URL is http://localhost:8080/tunnel-web/secure/json The headers are configured with basic authentication (it works; it is tested with the getStructure method). Data is: data_update = { "serviceClassName" : "com.liferay.portlet.journal.service.JournalStructureServiceUtil", "serviceMethodName" : "updateStructure", "serviceParameters" : "[groupId,structureId,parentStructureId,name,description,xsd,serviceContext]", "groupId" : 10133, "name" : FOO, "description" : FOO Structure, "xsd" : ... THE XSD OBTAINED VIA JSON ..., "serviceContext" : "{}" } Does anybody know the solution? Do I have to specify the length for the dictionary, and if so, how? Or is this a bug?
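
    For what it's worth, that ValueError is raised by urllib.request itself: when data is a dict, it cannot work out a Content-Length. The usual fix is to form-encode the dict and pass bytes so the length can be computed automatically. A minimal sketch reusing the question's URL, data_update and headers (whether Liferay then accepts the payload is a separate question):

        import urllib.parse
        import urllib.request

        # Form-encode the parameter dict, then encode to bytes so that
        # urllib can set Content-Length on its own.
        body = urllib.parse.urlencode(data_update).encode("utf-8")

        request = urllib.request.Request(url=URL, data=body, headers=headers)
        with urllib.request.urlopen(request) as response:
            print(response.read().decode("utf-8"))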

    Read the article

  • Accessing protected REST endpoint with JQuery

    - by Andy
    I have a site where members log in to their account (FormsAuth). I would like to set up a RESTful service that I can access using jQuery, and I would like to protect these services using the same FormsAuth. How would a third-party site be able to access these services? They would need to pass in the Principal/Identity to the service, right? I've only seen examples of Basic authentication (which Twitter uses and jQuery supports). I'm very new to WCF/REST, so I'm not sure how this should be done.

    Read the article

  • WCF REST based services authentication schemes

    - by FlySwat
    I have a simple authentication scheme for a set of semi-public REST APIs we are building: the client POSTs an ID/password to an authentication service; the service checks the credentials and returns a session token in a cookie; the client must then pass that session cookie with each API request or it will get a 401. This works well, because the client never needs to do anything except receive a cookie and then pass it along. For browser applications this happens automatically; for non-browser applications it is pretty trivial to save the cookie and send it with each request. However, I have not figured out a good approach for doing the initial handshake from browser applications. For example, if this is all happening using an AJAX technique, what prevents the user from being able to access the ID/password the client is using to handshake with the service? It seems like this is the only stumbling block to this approach, and I'm stumped.
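
    As an illustration of the non-browser half of that flow, here is a minimal sketch in Python: a single opener owns the cookie jar, so the session cookie issued by the authenticate call is replayed on every subsequent request (the host, paths and form fields are placeholders):

        import urllib.parse
        import urllib.request
        from http.cookiejar import CookieJar

        # One opener with a cookie jar: whatever cookie the auth endpoint sets
        # is automatically sent back on every later request.
        jar = CookieJar()
        opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

        # 1. POST credentials to the auth service (field names are illustrative).
        creds = urllib.parse.urlencode({"id": "alice", "pass": "secret"}).encode("utf-8")
        opener.open("https://example.com/Service/Authenticate", data=creds)

        # 2. Call a protected endpoint; the session cookie rides along
        #    (without it the service answers 401).
        with opener.open("https://example.com/Service/Something") as response:
            print(response.status, response.read().decode("utf-8"))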

    Read the article

  • Haskell as REST server

    - by Dev er dev
    I would like to try Haskell on a smallish project which should be well suited to it. I would like to use it as the backend to a small AJAX application. The Haskell backend should be able to do authentication (basic, form, whatever...), keep track of a user session (not much data there except for the username) and dispatch requests to handlers based on the URI and request type. It should also be able to serialize the response to both XML and JSON formats, depending on a request parameter. I suppose the handlers are ideally suited for Haskell, since the service is basically stateless, but I don't know where to start for the rest of the story. Searching Hackage didn't give me many hints.

    Read the article

  • How to serialize Java primitives using Jersey REST

    - by Olvagor
    In my application I use Jersey REST to serialize complex objects. This works quite fine. But there are a few methods which simply return an int or boolean. Jersey can't handle primitive types (to my knowledge), probably because they're not annotated and Jersey has no default annotation for them. I worked around that by creating complex types like a RestBoolean or RestInteger, which simply hold an int or boolean value and have the appropriate annotations. Isn't there an easier way than writing these container objects?

    Read the article

  • REST API unauthenticated requests exception based on the User-Agent

    - by Shay Tsadok
    Hi all, I am developing a REST API that supports two kinds of authentication protocols: login-form authentication for browser-based clients, and simple Basic authentication for non-browser clients. I developed a flow in which unauthenticated requests are redirected to the login form; the problem is that this is undesired behavior for non-browser clients! I thought to solve it by deciding, according to the User-Agent, what to do: browsers will be redirected to the login form, and non-browser clients will get the standard 401 with a Basic authentication challenge. A. What do you think about this solution? B. Is there a standard way in Java to check if the request came from a browser, or do I need to develop this kind of mechanism on my own? Thanks in advance!
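
    There is no standard Java API for "is this a browser"; a common heuristic is to look at the Accept and X-Requested-With headers rather than parsing User-Agent strings. A rough sketch of that decision, written in Python for brevity (the same check drops straight into a servlet filter), with everything in it being illustrative:

        def unauthenticated_response(headers):
            """Decide how to answer an unauthenticated request.

            headers: request headers as a dict with lower-cased keys.
            Returns (status, extra_headers).
            """
            accept = headers.get("accept", "")
            requested_with = headers.get("x-requested-with", "").lower()

            # Browsers navigating pages ask for text/html; AJAX calls usually tag
            # themselves with X-Requested-With, and API clients ask for JSON/XML.
            wants_html = "text/html" in accept and requested_with != "xmlhttprequest"

            if wants_html:
                return 302, {"Location": "/login"}                 # send browsers to the form
            return 401, {"WWW-Authenticate": 'Basic realm="api"'}  # challenge everyone else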

    Read the article

  • ASP.net error message when using REST starter kit

    - by jonhobbs
    Hi all, I've written some code using the REST starter kit and it works fine on my development machine. However, when I upload it to our server the page gives me the following error message... CS1684: Warning as Error: Reference to type 'System.Runtime.Serialization.Json.DataContractJsonSerializer' claims it is defined in 'c:\WINNT\assembly\GAC_MSIL\System.ServiceModel.Web\3.5.0.0__31bf3856ad364e35\System.ServiceModel.Web.dll', but it could not be found I've removed code line by line and it appears that the following line of code is triggering the error... HttpContent newOrganizationContent = HttpContentExtensions.CreateXmlSerializable(newOrganizationXml); I really haven't got a clue how to fix it. I assumed it might be because it needs a newer version of the framework to run, but looking in IIS it says it's running version 2.0.50727, which I think is the latest version because it reports that even when we're using framework 3.5. Very confused, any ideas? Jon

    Read the article

  • Rails, REST: render a different action with respond_to

    - by Sam
    Maybe my logic is not RESTful, or maybe I just don't know if this is how you would do it, but this is what I am trying to do. I'm getting a category inside a category controller, and once I have that category I want to return to an index page in a different controller but keep that @category and the Category.businesses. Before REST I would have just done this: render :controller => "businesses" and it would have rendered the view of the index action in that controller. Now in my respond_to block I have this: format.html {redirect_to(business_path)} # index.html.erb format.xml { render :xml => @businesses } but of course with a redirect it loses the instance variable and starts with a new action. So what I want to do is render the action instead of redirecting to that action. Is this possible?

    Read the article

  • Drupal REST services in JSON format

    - by Sreekanth Chandrabhatla
    I am working on Drupal 7. I created a REST service in Drupal Views, and I want to consume this service in my Android app. When I try to view my service at http://mysite.com/ubercart/?q=doctor I am getting a response like this: [{,"foaf:Document"],"title":{"predicates":["dc:title"]},"created":{"predicates":["dc:date","dc:created"],"datatype":"xsd:dateTime","callback":"date_iso8601"},"changed":{"predicates":["dc:modified"],"datatype":"xsd:dateTime","callback":"date_iso8601"},"body":{"predicates":[""vid":"12","uid":"1","title":"raja","log":"","status":"1","comment":"2","promote":"0","sticky":"0","nid":"12","type":"doctor","language":"und","created":"1351849158","changed":"1351849158","tnid":"0","translate":"0","revision_timestamp":"1351849158","revision_uid":"1","field_rating":{"und":[{"value":"4"}]},"field_place":{"und":[{"value":"Guntur","format":null,"safe_value":"Guntur"}]},"rdf_mapping":{"rdftype":["sioc:Item","foaf:Document"],"title":{"predicates":["dc:title"]},"created":{"predicates":["dc:date","dc:created"],"datatype":"xsd:dateTime","callback":"date_iso8601"},"changed":{"predicates":["dc:modified"],"datatype":"xsd:dateTime","callback":"date_iso8601"},"body":{"predicates":["content:encoded"]},"uid":{"predicates":["sioc:has_creator"],"type":"rel"},"name":{"predicates":["foaf:name"]},"comment_count":{"predicates":["sioc:num_replies"],"datatype":"xsd:integer"},"last_activity":{"predicates":["sioc:last_activity_date"],"datatype":"xsd:dateTime","callback":"date_iso8601"}},"cid":"0","last_comment_timestamp":"1351849158","last_comment_name":null,"last_comment_uid":"1","comment_count":"0","name":"admin","picture":"0","data":"b:0;","uc_order_product_id":false,"ucnc_product_nid":false},{"vid":"11","uid":"1","title":"ravi","log":"","status":"1","comment":"2","promote":"0","sticky":"0","nid":"11","type":"doctor","language":"und","created":"1351849131","changed":"1351849131","tnid":"0","translate":"0","revision_timestamp":"1351849131","revision_uid":"1","field_rating":{"und":[{"value":"5"}]},"field_place":{"und":[{"value":"Hyderabad","format":null,"safe_value":"Hyderabad"}]},"rdf_mapping":{"rdftype":["sioc:Item","foaf:Document"],"title":{"predicates":["dc:title"]},"created":{"predicates":["dc:date","dc:created"],"datatype":"xsd:dateTime","callback":"date_iso8601"},"changed":{"predicates":["dc:modified"],"datatype":"xsd:dateTime","callback":"date_iso8601"},"body":{"predicates":["content:encoded"]},"uid":{"predicates":["sioc:has_creator"],"type":"rel"},"name":{"predicates":["foaf:name"]},"comment_count":{"predicates":["sioc:num_replies"],"datatype":"xsd:integer"},"last_activity":{"predicates":["sioc:last_activity_date"],"datatype":"xsd:dateTime","callback":"date_iso8601"}},"cid":"0","last_comment_timestamp":"1351849131","last_comment_name":null,"last_comment_uid":"1","comment_count":"0","name":"admin","picture":"0","data":"b:0;","uc_order_product_id":false,"ucnc_product_nid":false},{"vid":"10","uid":"1","title":"sree","log":"","status":"1","comment":"2","promote":"0","sticky":"0","nid":"10","type":"doctor","language":"und","created":"1351849109","changed":"1351849109","tnid":"0","translate":"0","revision_timestamp":"1351849109","revision_uid":"1","field_rating":{"und":[{"value":"4"}]},"field_place":{"und":[{"value":"Hyderabad","format":null,"safe_value":"Hyderabad"}]},"rdf_mapping":{"rdftype":["sioc:Item"content:encoded"]},"uid":{"predicates":["sioc:has_creator"],"type":"rel"},"name":{"predicates":["foaf:name"]},"comment_count":{"predicates":["sioc:num_replies"],"datatype":"xsd:integer"},"last_activity":{"predicates":["sioc:last_activity_date"],"datatype":"xsd:dateTime","callback":"date_iso8601"}},"cid":"0","last_comment_timestamp":"1351849109","last_comment_name":null,"last_comment_uid":"1","comment_count":"0","name":"admin","picture":"0","data":"b:0;","uc_order_product_id":false,"ucnc_product_nid":false}] Actually, I need a response like this: {"nodes":{"0":{"node":{"title":"raja","field_place":"Guntur","rating":"4"}},"1":{"node":{"title":"ravi","field_place":"Hyderabad","rating":"5"}},"2":{"node":{"title":"sree","field_place":"Hyderabad","rating":"4"}}}} Can anyone help me out?

    Read the article

  • Creating an event invite using the old REST API

    - by Sven Koluem
    Hi, because there is no way to do this with the Graph API, I am trying to do it with the old REST API, but without any success: no error message, but also no invite. $restApi = $facebook->api(array( 'method' => 'events.invite', 'eid' => $eid, 'uids' => $testuserId, 'personal_message' => 'testing', 'access_token' => $accesstoken, )); print '<pre>' . print_r($restApi, true) . '</pre>'; Or maybe some of you know a better way. Thanks, Sven

    Read the article

  • JSONP Implications with true REST

    - by REA_ANDREW
    From my understanding, JSONP can only be achieved using the GET verb. Assuming this is true, which I think it is, then this rules out core compliance with true REST, in which you should make use of different verbs (GET, PUT, POST, DELETE, etc.) for different and specific purposes. My question is: what type of barriers am I likely to come up against if I were to, say, allow updating and deleting of resources through a JSONP service using a GET request? Is it better practice to offer a JSON service and state that the user will need a server-side proxy to consume it cross-domain from JavaScript? Cheers, Andrew
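
    One common workaround is to tunnel the intended verb in a query parameter and wrap the reply in the caller's callback, accepting exactly the barrier the question hints at: state-changing GETs can be cached, prefetched, logged or replayed, so they need their own CSRF-style protection. A rough server-side sketch; the _method and callback parameter names are conventions assumed here, not a standard:

        import json
        from urllib.parse import parse_qs

        ALLOWED_TUNNELED = {"POST", "PUT", "DELETE"}

        def handle_jsonp(query_string, handlers):
            """Dispatch a JSONP GET as if it were the verb named in _method.

            handlers: dict mapping verb -> callable(params) returning something
            JSON-serialisable. Returns the JavaScript body to send back with
            Content-Type application/javascript.
            """
            params = parse_qs(query_string)
            verb = params.get("_method", ["GET"])[0].upper()
            callback = params.get("callback", ["callback"])[0]

            if verb != "GET" and verb not in ALLOWED_TUNNELED:
                result = {"error": "method not allowed"}
            else:
                result = handlers[verb](params)

            # JSONP: the response is a script that invokes the caller's callback.
            return "%s(%s);" % (callback, json.dumps(result))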

    Read the article

  • WCF REST Does Not Contain All of the Relative File Path

    - by Brandon
    I have a RESTful WCF 3.5 endpoint as such: System.Security.User.svc This is supposed to represent the namespace of the User class and is desired behavior by our client. I have another endpoint I created for testing called: Echo.svc I am writing a custom IHttpModule, and in my module I follow what almost everyone does by doing: string path = HttpContext.Current.Request.AppRelativeCurrentExecutionFilePath; If I make a call to: http://localhost/services/Echo/test my path variable has a value of '~/echo/test'. However, when I make a call to: http://localhost/services/System.Security.User/test my path variable has a value of '~/system.security.user'. In my second situation, it is stripping off the '/test' on the end of any endpoint that contains multiple periods. This is undesired behavior, and the only solution I have found to fixing this is some ugly string manipulation using the property which does contain the complete URL path: string rawPath = HttpContext.Current.Request.RawUrl; This returns '/services/system.security.user/test'. Does anyone know why the second situation does not return the rest of the URL path for endpoints that contain multiple periods in the name?

    Read the article

  • Wrap MemoryStream in a using block REST WCF

    - by Mark
    I'm learning REST to implement some services with WCF. I implemented an example with a MemoryStream. Because MemoryStream is IDisposable, I wrapped it in a using block. When I do this I can sometimes see the XML response in the browser (IE8), and sometimes it will just show me the following error message: The download of the specified resource has failed. Error processing resource 'http://localhost:8889/SimpleGetService/'. Why does this occur? When I don't wrap it in a using block, I never seem to get the error.

    Read the article

  • data structure for counting frequencies in a database table-like format

    - by user373312
    I was wondering if there is a data structure optimized for counting frequencies against data that is stored in a database table-like format. For example, the data comes in a (comma) delimited format like below:
        col1, col2, col3
        x, a, green
        x, b, blue
        ...
        y, c, green
    Now I simply want to count the frequency of col1=x, or of col1=x and col3=green. I have been storing the data in a database table, but in my profiling and from empirical observation, the database connection is the bottleneck. I have tried using in-memory database solutions too, and that works quite well; the only problem is the memory requirements and the quirky init/destroy calls. Also, I work mainly with Java but have experience with .NET, and I was wondering if there is any API to work with "tabular" data in a LINQ-like way using Java. Any help is appreciated.
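
    For the in-memory route, the structure is usually nothing fancier than a hash map keyed by (column, value) pairs, plus one keyed by pairs of those pairs for joint counts. A quick sketch in Python (the question leans toward Java, where a HashMap keyed on a small value object does the same job); the file name is a placeholder:

        import csv
        from collections import Counter
        from itertools import combinations

        singles = Counter()  # frequency of a single col=value
        joints = Counter()   # frequency of (colA=valueA, colB=valueB)

        with open("data.csv", newline="") as f:
            for row in csv.DictReader(f, skipinitialspace=True):
                cells = tuple(sorted(row.items()))
                singles.update(cells)
                joints.update(combinations(cells, 2))

        print(singles[("col1", "x")])                      # count of col1=x
        print(joints[(("col1", "x"), ("col3", "green"))])  # count of col1=x and col3=green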

    Read the article

  • Parse REST XML into variables

    - by 001
    I want to get a response from a REST XML web service and parse it into variables so I can use them in my program. 1) How come this code does not work? I get an empty string... // Get response string ws_response=""; using (HttpWebResponse response = request.GetResponse() as HttpWebResponse) { // Get the response stream StreamReader reader = new StreamReader(response.GetResponseStream()); // web service response string ws_response = reader.ReadToEnd; // <---???? I get an empty string // do parsing here (ie XML element into variable) etc.. // }
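
    On the "parse it into variables" half, the general flow is the same in any language: read the whole response body into a string, load it into an XML parser, then pull out the elements you need (in the C# above that means reading the stream to the end and loading the result into an XmlDocument or XDocument). Sketched in Python purely to show the flow, with the URL and element names as placeholders:

        import urllib.request
        import xml.etree.ElementTree as ET

        # URL and element names are placeholders for whatever the service returns.
        with urllib.request.urlopen("https://example.com/service/resource") as response:
            ws_response = response.read().decode("utf-8")

        root = ET.fromstring(ws_response)
        status = root.findtext("status")    # <status>...</status>
        message = root.findtext("message")  # <message>...</message>
        print(status, message)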

    Read the article

  • Consume REST API from .NET

    - by Ajish
    Hi all, I am trying to consume a REST API from my .NET application. These APIs are all written in Java. I am asked to pass the authentication credentials via HTTP headers. How can I pass these authentication credentials, like 'DATE', 'AUTHORIZATION' and 'Accept', via HTTP headers? Which class in .NET can I use to accomplish this task? Can anyone help me with this? All your help will be appreciated. Ajish.

    Read the article

  • How to model file system operations with REST?

    - by massive
    There are obvious counterparts for some of a file system's basic operations (e.g. ls and rm), but how would you implement actions that are not straightforwardly RESTful, such as cp or mv? As answers to the question "REST services - exposing non-data actions" suggest, the preferred way of implementing cp would include GETting the resource, DELETEing it and PUTting it back again with a new name. But what if I needed to do it efficiently? For instance, what if the resource's size were huge? How would I eliminate the superfluous transmission of the resource's payload to the client and back to the originating server? Here is an illustration. I have a resource: /videos/my_videos/2-gigabyte-video.avi and I want to copy it into a new resource: /videos/johns_videos/copied-2-gigabyte-video.avi How would I implement the copy, move or other file system actions the RESTful way? Or is there even a proper way? Am I doing it all wrong?
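
    One answer worth knowing about: HTTP already has an extension for exactly this case. WebDAV (RFC 4918) defines COPY and MOVE methods with a Destination header, so the payload never travels through the client; whether that counts as pure REST is debatable, and it only helps if the server implements it. A sketch of issuing such a request, using the question's paths and an assumed host:

        from http.client import HTTPConnection

        # WebDAV-style COPY: the server duplicates the resource itself, so the
        # 2 GB payload is never streamed down to the client and back up.
        conn = HTTPConnection("example.com")
        conn.request(
            "COPY",
            "/videos/my_videos/2-gigabyte-video.avi",
            headers={"Destination": "/videos/johns_videos/copied-2-gigabyte-video.avi"},
        )
        response = conn.getresponse()
        print(response.status, response.reason)  # RFC 4918: 201 Created on success
        conn.close()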

    Read the article

  • @Path and regular expression (Jersey/REST)

    - by Castanho
    Hi there! I'm using Jersey in a REST project and I need to use regular expressions. From what I've dug up, it is as simple as: @Path("/resources") public class MyResource { @GET @Path("{subResources:.*}/bar") public String get() {...} } But I'm only able to use a regex if my @Path contains a variable or a text value before it, for example: @Path("{SubResource1}/{subResources:.*}/bar") or @Path("hardCodeString/{subResources:.*}/bar") For now I could live with the variable workaround, but it is not OK from my perspective. Question: has anyone worked with something related? Am I doing something wrong? I think this could be a bug when working with more than one @Path, one on the class and the other on the method. Any tips are appreciated! Regards

    Read the article

  • Oracle Big Data Software Downloads

    - by Mike.Hallett(at)Oracle-BI&EPM
    Companies have been making business decisions for decades based on transactional data stored in relational databases. Beyond that critical data lies a potential treasure trove of less structured data: weblogs, social media, email, sensors, and photographs that can be mined for useful information. Oracle offers a broad integrated portfolio of products to help you acquire and organize these diverse data sources and analyze them alongside your existing data to find new insights and capitalize on hidden relationships.

    Oracle Big Data Connectors Downloads (here) include:
    - Oracle SQL Connector for Hadoop Distributed File System Release 2.1.0
    - Oracle Loader for Hadoop Release 2.1.0
    - Oracle Data Integrator Companion 11g
    - Oracle R Connector for Hadoop v 2.1

    Oracle Big Data Documentation: The Oracle Big Data solution offers an integrated portfolio of products to help you organize and analyze your diverse data sources alongside your existing data to find new insights and capitalize on hidden relationships. Oracle Big Data, Release 2.2.0 - E41604_01 (zip, 27.4 MB); Integrated Software and Big Data Connectors User's Guide (HTML, PDF).

    Oracle Data Integrator (ODI) Application Adapter for Hadoop: Apache Hadoop is designed to handle and process data that is typically from non-relational data sources and data volumes that are beyond what is handled by relational databases. Typical processing in Hadoop includes data validation and transformations that are programmed as MapReduce jobs. Designing and implementing a MapReduce job usually requires expert programming knowledge. However, when you use Oracle Data Integrator with the Application Adapter for Hadoop, you do not need to write MapReduce jobs. Oracle Data Integrator uses Hive and the Hive Query Language (HiveQL), a SQL-like language for implementing MapReduce jobs. Employing familiar and easy-to-use tools and pre-configured knowledge modules (KMs), the application adapter provides the following capabilities:
    - Loading data into Hadoop from the local file system and HDFS
    - Performing validation and transformation of data within Hadoop
    - Loading processed data from Hadoop to an Oracle database for further processing and generating reports

    Oracle Loader for Hadoop: Oracle Loader for Hadoop is an efficient and high-performance loader for fast movement of data from a Hadoop cluster into a table in an Oracle database. It pre-partitions the data if necessary and transforms it into a database-ready format. Oracle Loader for Hadoop is a Java MapReduce application that balances the data across reducers to help maximize performance.

    Oracle R Connector for Hadoop: Oracle R Connector for Hadoop is a collection of R packages that provide:
    - Interfaces to work with Hive tables, the Apache Hadoop compute infrastructure, the local R environment, and Oracle database tables
    - Predictive analytic techniques, written in R or Java as Hadoop MapReduce jobs, that can be applied to data in HDFS files
    You install and load this package as you would any other R package. Using simple R functions, you can perform tasks such as:
    - Access and transform HDFS data using a Hive-enabled transparency layer
    - Use the R language for writing mappers and reducers
    - Copy data between R memory, the local file system, HDFS, Hive, and Oracle databases
    - Schedule R programs to execute as Hadoop MapReduce jobs and return the results to any of those locations

    Oracle SQL Connector for Hadoop Distributed File System: Using Oracle SQL Connector for HDFS, you can use an Oracle Database to access and analyze data residing in Hadoop in these formats:
    - Data Pump files in HDFS
    - Delimited text files in HDFS
    - Hive tables
    For other file formats, such as JSON files, you can stage the input in Hive tables before using Oracle SQL Connector for HDFS. Oracle SQL Connector for HDFS uses external tables to provide Oracle Database with read access to Hive tables, and to delimited text files and Data Pump files in HDFS.

    Related Documentation: Cloudera's Distribution Including Apache Hadoop Library (HTML); Oracle R Enterprise (HTML); Oracle NoSQL Database (HTML)

    Recent Blog Posts: Big Data Appliance vs. DIY Price Comparison; Big Data: Architecture Overview; Big Data: Achieve the Impossible in Real-Time; Big Data: Vertical Behavioral Analytics; Big Data: In-Memory MapReduce; Flume and Hive for Log Analytics; Building Workflows in Oozie

    Read the article

  • The Ins and Outs of Effective Smart Grid Data Management

    - by caroline.yu
    Oracle Utilities and Accenture recently sponsored a one-hour Web cast entitled, "The Ins and Outs of Effective Smart Grid Data Management." Oracle and Accenture created this Web cast to help utilities better understand the types of data collected over smart grid networks and the issues associated with mapping out a coherent information management strategy. The Web cast also addressed important points that utilities must consider with the imminent flood of data that both present and next-generation smart grid components will generate. The three speakers, including Oracle Utilities' Brad Williams, focused on the key factors associated with taking the millions of data points captured in real time and implementing the strategies, frameworks and technologies that enable utilities to process, store, analyze, visualize, integrate, transport and transform data into the information required to deliver targeted business benefits. The Web cast replay is available here. The Web cast slides are available here.

    Read the article

  • What Works in Data Integration?

    - by dain.hansen
    TDWI recently put out this paper on "What Works in Data Integration". I invite you especially to take a look at the section on "Accelerating your Business with Real-time Data Integration" and the DIRECTV case study. The article discusses some of the technology considerations for BI/DW and how data integration plays a role in delivering timely, accessible, and high-quality data. It goes on to outline the three key requirements for delivering high performance, low impact, and reliability, and how that can translate to faster results. The DIRECTV webinar is something you definitely want to take a look at; you'll hear how DIRECTV successfully transformed their data warehouse investments into a competitive advantage with Oracle GoldenGate.

    Read the article

  • Are there sources of email marketing data available?

    - by Gortron
    Are sources of email marketing data available to the public? I would like to see what kind of content a business sends out, the frequency of sending, the number of people emailed, and especially the resulting open rates and click-through rates. Are businesses willing to share data on their previous email marketing campaigns without divulging their contact lists? I would like to use this data to create an application that helps businesses create better newsletters by using it as a benchmark, basically sharing what works and what doesn't for each industry.

    Read the article

  • How to Achieve Real-Time Data Protection and Availability....For Real

    - by JoeMeeks
    There is a class of business and mission-critical applications where downtime or data loss have a substantial negative impact on revenue, customer service, reputation, cost, etc. Because the Oracle Database is used extensively to provide reliable performance and availability for this class of application, it also provides an integrated set of capabilities for real-time data protection and availability.

    Active Data Guard, depicted in the figure below, is the cornerstone for accomplishing these objectives because it provides the absolute best real-time data protection and availability for the Oracle Database. This is a bold statement, but it is supported by the facts. It isn't so much that alternative solutions are bad, it's just that their architectures prevent them from achieving the same levels of data protection, availability, simplicity, and asset utilization provided by Active Data Guard. Let's explore further.

    Backups are the most popular method used to protect data and are an essential best practice for every database. Not surprisingly, Oracle Recovery Manager (RMAN) is one of the most commonly used features of the Oracle Database. But comparing Active Data Guard to backups is like comparing apples to motorcycles. Active Data Guard uses a hot (open read-only), synchronized copy of the production database to provide real-time data protection and HA. In contrast, a restore from backup takes time and often has many moving parts - people, processes, software and systems - that can create a level of uncertainty during an outage that critical applications can't afford. This is why backups play a secondary role for your most critical databases by complementing real-time solutions that can provide both data protection and availability.

    Before Data Guard, enterprises used storage remote-mirroring for real-time data protection and availability. Remote-mirroring is a sophisticated storage technology promoted as a generic infrastructure solution that makes a simple promise: whatever is written to a primary volume will also be written to the mirrored volume at a remote site. Keeping this promise is also what causes data loss and downtime when the data written to primary volumes is corrupt; the same corruption is faithfully mirrored to the remote volume, making both copies unusable. This happens because remote-mirroring is a generic process. It has no intrinsic knowledge of Oracle data structures to enable advanced protection, nor can it perform independent Oracle validation BEFORE changes are applied to the remote copy. There is also nothing to prevent human error (e.g. a storage admin accidentally deleting critical files) from also impacting the remote mirrored copy.

    Remote-mirroring tricks users by creating a false impression that there are two separate copies of the Oracle Database. In truth, while remote-mirroring maintains two copies of the data on different volumes, both are part of a single closely coupled system. Not only will remote-mirroring propagate corruptions and administrative errors, but the changes applied to the mirrored volume are a result of the same Oracle code path that applied the change to the source volume. There is no isolation, either from a storage-mirroring perspective or from an Oracle software perspective. Bottom line, storage remote-mirroring lacks both the smarts and the isolation level necessary to provide true data protection.

    Active Data Guard offers much more than storage remote-mirroring when your objective is protecting your enterprise from downtime and data loss. Like remote-mirroring, an Active Data Guard replica is an exact block-for-block copy of the primary. Unlike remote-mirroring, an Active Data Guard replica is NOT a tightly coupled copy of the source volumes - it is a completely independent Oracle Database. Active Data Guard's inherent knowledge of Oracle data block and redo structures enables a separate Oracle Database, using a different Oracle code path than the primary, to apply the full complement of Oracle data validation methods before changes are applied to the synchronized copy. These include physical checksums, logical intra-block checking, lost-write validation, and automatic block repair. The figure below illustrates the stark difference between the knowledge that remote-mirroring can discern from an Oracle data block and what Active Data Guard can discern.

    An Active Data Guard standby also provides a range of additional services enabled by the fact that it is a running Oracle Database - not just a mirrored copy of data files. An Active Data Guard standby database can be open read-only while it is synchronizing with the primary. This enables read-only workloads to be offloaded from the primary system and run on the active standby, boosting performance by utilizing all assets. An Active Data Guard standby can also be used to implement many types of system and database maintenance in rolling fashion. Maintenance and upgrades are first implemented on the standby while production runs unaffected at the primary. After the primary and standby are synchronized and all changes have been validated, the production workload is quickly switched to the standby. The only downtime is the time required for user connections to transfer from one system to the next. These capabilities further expand the expectations of availability offered by a data protection solution beyond what is possible using storage remote-mirroring.

    So don't be fooled by appearances. Storage remote-mirroring and Active Data Guard replication may look similar on the surface, but the devil is in the details. Only Active Data Guard has the smarts, the isolation, and the simplicity to provide the best data protection and availability for the Oracle Database. Stay tuned for future blog posts that dive into the many differences between storage remote-mirroring and Active Data Guard along the dimensions of data protection, data availability, cost, asset utilization and return on investment.

    For additional information on Active Data Guard, see:
    - Active Data Guard Technical White Paper
    - Active Data Guard vs Storage Remote-Mirroring
    - Active Data Guard Home Page on the Oracle Technology Network

    Read the article
