Search Results

Search found 108959 results on 4359 pages for 'ado net data services'.


  • .NET 3.5 SP1 prerequisite, MS giving the clients 4.0

    - by Matt Bridges
    I have been using an MSI to install a WPF application that targets .NET Framework 3.5 SP1. I have set up .NET 3.5 as a prerequisite in the MSI, and what has happened for ages is that when the user does not have .NET 3.5 SP1, the MSI first has them download and install that before resuming the installation of my application. Since yesterday, when MS released .NET 4.0, users who don't have .NET 3.5 SP1 are directed by the MSI to install 4.0 instead. What happens, though, is that after they finish installing 4.0, the MSI still detects that they don't have 3.5 and directs them to the 4.0 install site again. So the user has 4.0, but the MSI never gets around to installing my application. What do I have to change in my application? This seems like an error in how MS is handling the prerequisites, either on their server or in the MSI in VS 2008.
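
    For reference, the prerequisite check here boils down to a registry probe. A minimal C# sketch of the .NET 3.5 SP1 detection, using the framework's documented setup registry keys (treat this as an illustration, not the bootstrapper's actual code):

        using Microsoft.Win32;

        static class NetFxCheck
        {
            // .NET 3.5 SP1 writes Install=1 and SP>=1 under this key.
            public static bool HasNet35Sp1()
            {
                using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
                    @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v3.5"))
                {
                    return key != null
                        && (int)key.GetValue("Install", 0) == 1
                        && (int)key.GetValue("SP", 0) >= 1;
                }
            }
        }

    A check like this returning false after a 4.0 install is exactly the loop described above: 4.0 does not create the v3.5 key, so the 3.5 prerequisite keeps failing.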

    Read the article

  • Limit the model data fields serialized by Web API based on the return type Interface

    - by Stevo3000
    We're updating our architecture to use a single object model for desktop, web and mobile that can be used in the MVVM pattern. I would like to be able to limit the data fields that are serialized through Web API by using interfaces on the controllers. This is required because the model objects for mobile are stored in HTML5 local storage, so they don't carry optional data, while a thin desktop client would be able to store (and work with) more data. To achieve this, a model implements the different interfaces that define which data fields should be serialized, and there is a controller specific to each interface. The problem is that Web API always serializes every field in the model, even if it is not part of the interface being returned. How can we serialize only the fields in the returned interface?
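
    One workable approach, sketched here rather than taken from the poster's code, is a Json.NET contract resolver that builds the property list from the interface instead of the concrete type:

        using System;
        using System.Collections.Generic;
        using Newtonsoft.Json;
        using Newtonsoft.Json.Serialization;

        // Builds the serialized property list from an interface instead of
        // the concrete model type, so extra fields never leave the server.
        public class InterfaceContractResolver<TInterface> : DefaultContractResolver
        {
            protected override IList<JsonProperty> CreateProperties(
                Type type, MemberSerialization memberSerialization)
            {
                return base.CreateProperties(typeof(TInterface), memberSerialization);
            }
        }

    Assigning, say, new InterfaceContractResolver<IMobileProduct>() (a hypothetical interface name) to config.Formatters.JsonFormatter.SerializerSettings.ContractResolver, or to a per-controller formatter, would then trim the payload to the interface's members.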

    Read the article

  • receive emails in a .NET service (C#)

    - by Jean Azzopardi
    Hi, this is my first posting on stackoverflow, so don't flame me too much ;) I'm building a service that monitors devices and should be able to receive emails from users, parse them, and take action accordingly. It should also be able to send emails about the status of the devices, etc. I'll be using Windows Live email and, as I said, a .NET service that should be able to send/receive emails. I am wondering what kind of system I would need to cater for receiving the emails, as I already know how to send them via the System.Net.Mail API. Thanks. EDIT: Thanks for your comments everybody, I'm looking forward to implementing this system and asking more questions on this rather excellent site.
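
    Since System.Net.Mail only sends, receiving means speaking POP3 or IMAP, either through a library or by hand. A minimal, hedged sketch of a raw POP3 session over SSL in C# (the host name and credentials are placeholders; the Live/Hotmail POP3 endpoint may differ):

        using System;
        using System.IO;
        using System.Net.Security;
        using System.Net.Sockets;

        class Pop3Probe
        {
            static void Main()
            {
                // Placeholder host/credentials; substitute your own account.
                using (var client = new TcpClient("pop3.live.com", 995))
                using (var ssl = new SslStream(client.GetStream()))
                {
                    ssl.AuthenticateAsClient("pop3.live.com");
                    var reader = new StreamReader(ssl);
                    var writer = new StreamWriter(ssl) { AutoFlush = true, NewLine = "\r\n" };

                    Console.WriteLine(reader.ReadLine());      // server greeting: +OK ...
                    writer.WriteLine("USER someone@live.com");
                    Console.WriteLine(reader.ReadLine());
                    writer.WriteLine("PASS secret");
                    Console.WriteLine(reader.ReadLine());
                    writer.WriteLine("STAT");                  // +OK <message count> <size>
                    Console.WriteLine(reader.ReadLine());
                    writer.WriteLine("QUIT");
                }
            }
        }

    RETR n would fetch message n for parsing; in practice a tested POP3/IMAP library saves a lot of protocol edge cases.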

    Read the article

  • Asp.net (c#) and a JEE 5 webservice

    - by arinte
    We have an ASP.NET app that talks to a pretty complex JEE 5 web service. Everything works fine except when we throw an exception. We throw a simple exception without any inner/original exception, but we get this message on the .NET side: Additional XML content is present in the fault detail element. Only a single element is allowed. It works fine with a Java client, as in we can get the exception message there. What can we do?

    Read the article

  • Insert a Join Statement - (Insert Data to Multiple Tables) - C#/SQL/T-SQL/.NET

    - by peace
    I have a WinForms form with fields that need to be filled in by a user. Not all of the fields belong to one table; the data will go to a Customer table and a CustomerPhone table, so I decided to do multiple inserts. I will insert the appropriate data into CustomerPhone first, then insert the rest into the Customer table. Is it possible to join an insert, or insert into a join? If you can show me a rough sample, I will be grateful. Many thanks.
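
    There is no INSERT that targets a join of two tables; the usual pattern is two inserts inside one transaction, capturing the generated key from the parent row. A hedged C# sketch (the schema, table and column names are invented for illustration):

        using System.Data.SqlClient;

        static class CustomerWriter
        {
            // Hypothetical schema: Customer(CustomerId IDENTITY, Name) and
            // CustomerPhone(CustomerId FK, PhoneNumber).
            public static void Save(string connectionString, string name, string phone)
            {
                using (var conn = new SqlConnection(connectionString))
                {
                    conn.Open();
                    using (var tx = conn.BeginTransaction())
                    {
                        var insertCustomer = new SqlCommand(
                            "INSERT INTO Customer (Name) VALUES (@name); " +
                            "SELECT CAST(SCOPE_IDENTITY() AS int);", conn, tx);
                        insertCustomer.Parameters.AddWithValue("@name", name);
                        int customerId = (int)insertCustomer.ExecuteScalar();

                        var insertPhone = new SqlCommand(
                            "INSERT INTO CustomerPhone (CustomerId, PhoneNumber) " +
                            "VALUES (@id, @phone)", conn, tx);
                        insertPhone.Parameters.AddWithValue("@id", customerId);
                        insertPhone.Parameters.AddWithValue("@phone", phone);
                        insertPhone.ExecuteNonQuery();

                        tx.Commit();   // both rows persist together, or neither does
                    }
                }
            }
        }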

    Read the article

  • best way to communicate between .net and java

    - by manu1001
    I have a .NET application which needs to expose a service consumed by a Java client. The service can't be public; there should be some authentication mechanism for the client. What is the best way to do this? I'm new to web services and am confused by all the SOAP, WSDL, etc., and have also heard a lot that it'll be a pain to get the two to communicate. Your thoughts?
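
    One commonly interoperable option is a WCF SOAP endpoint over BasicHttpBinding with username credentials, which JAX-WS clients can consume straight from the WSDL. A hedged self-hosting sketch (service names and the address are placeholders; transport security needs a real certificate bound to the port):

        using System;
        using System.ServiceModel;

        [ServiceContract]
        public interface IDeviceService
        {
            [OperationContract]
            string GetStatus(string deviceId);
        }

        public class DeviceService : IDeviceService
        {
            public string GetStatus(string deviceId) { return "online"; }
        }

        class Program
        {
            static void Main()
            {
                // SOAP over HTTPS with a username/password in the message,
                // a shape JAX-WS and Axis2 clients handle well.
                var binding = new BasicHttpBinding(BasicHttpSecurityMode.TransportWithMessageCredential);
                binding.Security.Message.ClientCredentialType = BasicHttpMessageCredentialType.UserName;

                using (var host = new ServiceHost(typeof(DeviceService),
                    new Uri("https://devices.example.local/DeviceService")))   // placeholder address
                {
                    host.AddServiceEndpoint(typeof(IDeviceService), binding, "");
                    host.Open();
                    Console.WriteLine("Listening; press Enter to stop.");
                    Console.ReadLine();
                }
            }
        }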

    Read the article

  • Terminal services and screen savers

    - by Corsair
    I have a Windows Server 2008 R2 server acting as a terminal server for mobile devices running a single application, but I am having issues with the screen saver kicking in after ten minutes. Is there a setting in Terminal Services to make the screensaver settings universal? If not, how should I approach this issue?

    Read the article

  • Asp.net web service: Problems accessing without www

    - by MysterM
    I have an ASP.NET web service running at www.domain.com/Service.svc that I connect to using jQuery from my ASP.NET website. Everything works perfectly if the user accesses my website via www.domain.com. But if the user uses only domain.com, I get the error: There was no channel actively listening at 'http://domain.com/Service.svc/get?date=2010-10-09'. This is often caused by an incorrect address URI. Ensure that the address to which the message is sent matches an address on which a service is listening. In my web.config I use the serviceHostingEnvironment tag to get the service running on my web host (it doesn't work otherwise). Maybe this is what causes the error? Note that the baseAddressPrefixFilters element below lists only http://www.domain.com, which matches the symptom. Here is my system.serviceModel section from web.config (I had some problems setting up the web service, so my web.config might be a bit messy):

        <system.serviceModel>
          <behaviors>
            <endpointBehaviors>
              <behavior name="ServiceAspNetAjaxBehavior">
                <enableWebScript />
              </behavior>
            </endpointBehaviors>
            <serviceBehaviors>
              <behavior name="ServiceBehavior">
                <serviceDebug includeExceptionDetailInFaults="true" />
              </behavior>
            </serviceBehaviors>
          </behaviors>
          <serviceHostingEnvironment aspNetCompatibilityEnabled="true">
            <baseAddressPrefixFilters>
              <add prefix="http://www.domain.com"/>
            </baseAddressPrefixFilters>
          </serviceHostingEnvironment>
          <services>
            <service behaviorConfiguration="ServiceBehavior" name="Service">
              <endpoint address="" behaviorConfiguration="ServiceAspNetAjaxBehavior"
                        binding="webHttpBinding" bindingConfiguration="ServiceBinding"
                        contract="Service" />
            </service>
          </services>
          <bindings>
            <webHttpBinding>
              <binding name="ServiceBinding" maxBufferPoolSize="1000000" maxReceivedMessageSize="1000000">
                <readerQuotas maxDepth="1000000" maxStringContentLength="1000000"
                              maxArrayLength="1000000" maxBytesPerRead="1000000"
                              maxNameTableCharCount="1000000" />
              </binding>
            </webHttpBinding>
          </bindings>
        </system.serviceModel>

    How can I make it possible for my users to access my web service when using only domain.com as well?

    Read the article

  • JSON DATA formatting in WebAPI

    - by user1736299
    public class CalendarController : ApiController
    {
        Events[] events = new Events[]
        {
            new Events { title = "event1", start = System.DateTime.UtcNow, end = System.DateTime.UtcNow },
            new Events { title = "event2", start = System.DateTime.UtcNow, end = System.DateTime.UtcNow },
            new Events { title = "event3", start = System.DateTime.UtcNow, end = System.DateTime.UtcNow }
        };

        public IEnumerable<Events> GetAllCalendar()
        {
            return events;
        }
    }

    The JSON result for the above is:

        [{ "title": "event1", "start": "2012-12-05T22:52:35.6471712Z", "end": "2012-12-05T22:52:35.6471712Z" },
         { "title": "event2", "start": "2012-12-05T22:52:35.6471712Z", "end": "2012-12-05T22:52:35.6471712Z" },
         { "title": "event3", "start": "2012-12-05T22:52:35.6471712Z", "end": "2012-12-05T22:52:35.6471712Z" }]

    How can I create the same JSON result with single quotes instead of double quotes? And how can I get the dates in the format 'YYYY-MM-DD HH:MM:SS'? Thank you, Smith
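
    Two notes on this one, hedged as general Json.NET/Web API behavior rather than a guaranteed fix: single quotes would make the output invalid JSON, so the double quotes should stay; the date shape, though, can be changed globally through the JSON formatter's serializer settings, for example:

        using Newtonsoft.Json.Converters;
        using System.Web.Http;

        public static class WebApiConfig
        {
            public static void Register(HttpConfiguration config)
            {
                // Emit dates as "yyyy-MM-dd HH:mm:ss" instead of the
                // ISO 8601 round-trip format shown above.
                config.Formatters.JsonFormatter.SerializerSettings.Converters.Add(
                    new IsoDateTimeConverter { DateTimeFormat = "yyyy-MM-dd HH:mm:ss" });
            }
        }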

    Read the article

  • import excel to sql db table

    - by droyce
    I am trying to write data from an Excel spreadsheet to a SQL database. I have been able to connect to the Excel spreadsheet and read the data, but I am unable to get the data to insert into the SQL DB table. The current code is as follows; any help is most appreciated.

        Dim plmExcelCon As System.Data.OleDb.OleDbConnection
        Dim ldExcelDS As System.Data.DataSet
        Dim cmdLoadExcel As System.Data.OleDb.OleDbDataAdapter
        Dim PrmPathExcelFile As String

        PrmPathExcelFile = txtImportFileLocation.Text.ToString
        plmExcelCon = New System.Data.OleDb.OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + PrmPathExcelFile + ";Extended Properties=Excel 12.0;")
        cmdLoadExcel = New System.Data.OleDb.OleDbDataAdapter("select * from [" + txtImportSheetName.Text + "$]", plmExcelCon)
        ldExcelDS = New System.Data.DataSet
        cmdLoadExcel.Fill(ldExcelDS)
        dgvImportData.DataSource = ldExcelDS.Tables(0)
        plmExcelCon.Close()

        cmdINSERT.Parameters("@[SQL COLUMN NAME]").Value = [Not sure how to set value from datagrid view]
        cnLD.Open()
        cmdINSERT.ExecuteNonQuery()
        cnLD.Close()
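
    The missing piece is binding each imported row to the INSERT's parameters and executing once per row. A hedged sketch in C# (the question's code is VB.NET, but the ADO.NET calls are identical; the target table and column names are invented):

        using System.Data;
        using System.Data.SqlClient;

        static class ExcelImport
        {
            // Loops the DataTable the OleDbDataAdapter filled and inserts
            // each spreadsheet row. Column names here are placeholders.
            public static void InsertRows(DataTable sheet, string sqlConnectionString)
            {
                using (var conn = new SqlConnection(sqlConnectionString))
                using (var insert = new SqlCommand(
                    "INSERT INTO TargetTable (ColumnA, ColumnB) VALUES (@a, @b)", conn))
                {
                    insert.Parameters.Add("@a", SqlDbType.NVarChar, 255);
                    insert.Parameters.Add("@b", SqlDbType.NVarChar, 255);
                    conn.Open();
                    foreach (DataRow row in sheet.Rows)    // ldExcelDS.Tables(0) above
                    {
                        insert.Parameters["@a"].Value = row["ExcelColumnA"];
                        insert.Parameters["@b"].Value = row["ExcelColumnB"];
                        insert.ExecuteNonQuery();          // one INSERT per spreadsheet row
                    }
                }
            }
        }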

    Read the article

  • Saving ntext data from SQL Server to file directory using asp

    - by April
    A variety of files (pdf, images, etc.) are stored in an ntext field on a MS SQL Server. I am not sure what type is in this field, other than it shows question marks and undefined characters; I am assuming they are binary. The script is supposed to iterate through the rows and extract and save these files to a temp directory. "filename" and "contenttype" are given, and "data" is whatever is in the ntext field. I have tried several solutions:

    1) data.SaveToFile "/temp/"&filename, 2
       Error: Object required: '????????????????????'

    2) File.WriteAllBytes "/temp/"&filename, data
       Error: Object required: 'File'
       I have no idea how to import this, or the Server for MapPath. (Cue: what a noob!)

    3) Const adTypeBinary = 1
       Const adSaveCreateOverWrite = 2
       Dim BinaryStream
       Set BinaryStream = CreateObject("ADODB.Stream")
       BinaryStream.Type = adTypeBinary
       BinaryStream.Open
       BinaryStream.Write data
       BinaryStream.SaveToFile "C:\temp\" & filename, adSaveCreateOverWrite
       Error: Arguments are of the wrong type, are out of acceptable range, or are in conflict with one another.

    4) Response.ContentType = contenttype
       Response.AddHeader "content-disposition","attachment;" & filename
       Response.BinaryWrite data
       Response.End
       This works, but the file should be saved to the server instead of popping up a save-as dialog. I am not sure if there is a way to save the response to a file.

    Thanks for shedding light on any of these problems!

    Read the article

  • possibility to do type/data conversion of data returned by ria services?

    - by reinier
    My service returns a byte array, which I have to convert to an animated GIF (using ImageTools, since Silverlight doesn't support the format natively). I was wondering: is it possible to insert some code at the client where I can do the conversion before the actual object is returned to whatever it is bound against? On the server side, the queries can be customized before they are sent over the wire. I'm asking for the exact opposite: can I do some on-the-fly conversion the moment the data comes off the wire and before it is returned to the controls? If I'm overthinking this and there is a smarter/better/easier way to do it, I'm all for such an answer as well.
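
    One client-side hook that runs per bound value, rather than per service call, is a Silverlight IValueConverter placed on the binding. A hedged sketch; the actual ImageTools decoding call is left as a placeholder since it depends on the library version:

        using System;
        using System.Globalization;
        using System.Windows.Data;

        // Runs on the client as each bound value is handed to the control,
        // i.e. after the data comes off the wire and before the UI sees it.
        public class ByteArrayToImageConverter : IValueConverter
        {
            public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
            {
                var bytes = value as byte[];
                if (bytes == null) return null;
                return DecodeAnimatedGif(bytes);   // hypothetical helper around ImageTools
            }

            public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
            {
                throw new NotSupportedException();
            }

            private static object DecodeAnimatedGif(byte[] bytes)
            {
                // ImageTools decoding elided; substitute the decoder you use.
                throw new NotImplementedException();
            }
        }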

    Read the article

  • ASP.NET/VB/SQL: trying to insert data, getting error "no value given for required parameters"

    - by Sara
    I am pretty sure this is a basic syntax error; I am new at this and basically figuring things out by trial and error... I am trying to insert data from text boxes into an Access database, where the primary key fields in tableCourse are prefix and course_number. It keeps giving me the "no value given for one or more required parameters" error. Here is my codebehind:

        Protected Sub Wizard1_FinishButtonClick(ByVal sender As Object, ByVal e As System.Web.UI.WebControls.WizardNavigationEventArgs) Handles Wizard1.FinishButtonClick
            'Collect data
            Dim myDept = txtDept.Text
            Dim myFirst = txtFirstName.Text
            Dim myLast = txtLastName.Text
            Dim myPrefix = txtCoursePrefix.Text
            Dim myNum = txtCourseNum.Text

            'Define connection
            Dim myConn As New OleDbConnection
            myConn.ConnectionString = AccessDataSource1.ConnectionString

            'Create commands
            Dim myIns1 As New OleDbCommand("INSERT INTO tableCourse (department, name_first, name_last, prefix, course_number) VALUES (@myDept, @myFirst, @myLast, @myPrefix, @myNum)", myConn)

            'Execute the commands
            myConn.Open()
            myIns1.ExecuteNonQuery()
        End Sub
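
    A likely cause, hedged since only the snippet above is visible: the command's SQL declares named tokens but no parameters are ever added to it, and the Access OLE DB provider binds parameters strictly by position anyway. A C# sketch of the usual shape (the question's code is VB.NET, but the ADO.NET calls are identical):

        using System.Data.OleDb;

        static class CourseInsert
        {
            public static void Insert(string connectionString, string dept,
                string first, string last, string prefix, string number)
            {
                using (var conn = new OleDbConnection(connectionString))
                using (var cmd = new OleDbCommand(
                    "INSERT INTO tableCourse (department, name_first, name_last, prefix, course_number) " +
                    "VALUES (?, ?, ?, ?, ?)", conn))
                {
                    // Names are ignored by the provider; order must match the SQL.
                    cmd.Parameters.AddWithValue("@myDept", dept);     // position 1
                    cmd.Parameters.AddWithValue("@myFirst", first);   // position 2
                    cmd.Parameters.AddWithValue("@myLast", last);     // position 3
                    cmd.Parameters.AddWithValue("@myPrefix", prefix); // position 4
                    cmd.Parameters.AddWithValue("@myNum", number);    // position 5
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }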

    Read the article

  • Best tool for monitoring Coldfusion interoperability with .Net web service

    - by John Galt
    Is there a tool to see the messages sent to a web service hosted on an IIS server? I have a web service written in .NET, and our ColdFusion people are having trouble building a "complex" parameter. This problem is described from a ColdFusion perspective at: adobe forum question. It runs fine when called from a .NET client. While it is normally hosted on a server inside our LAN, I put it out on a public server so the WSDL could be viewed: please take a quick look at this WSDL here. When the CF developer runs her code, she gets: java.lang.IllegalArgumentException: argument type mismatch ...and I am wondering if there is a tool I could run on the server that hosts my web service to see whether the call is even reaching the service, or whether it is being rejected by the Java code that CF uses before it ever gets to my web service.

    Read the article

  • MVC way of handling data input

    - by korki
    I have a data input module where I add the information for my product and its sub-information, like: product basic info, product price info, product price details. Price info and price details are related to the product and are lists. In my Web Forms approach I would store my main product object in the view state and populate its pricing info and details while doing AJAX postbacks. This way I can create a compact module that is very user-friendly in terms of defining a lot of data from one place, without the need to enter the data through separate modules. When I am done, I do one product.Save() and that persists all the data to the respective tables in the db. Now I am building a similar app on the .NET MVC framework and pondering what would be a good way of handling this in MVC. I don't lean towards storing all of this on the client side until I click save, and saving to the db after each action reminds me of the days I was coding in classic ASP. I will appreciate your input on ways to approach this in the MVC framework.
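
    For what it's worth, a common MVC shape for this is one view model carrying the child lists, posted once and saved atomically; the default binder rebuilds the lists from indexed field names. A hedged sketch (model, repository and action names are all invented):

        using System.Collections.Generic;
        using System.Web.Mvc;

        public class PriceInfoModel { public decimal Amount { get; set; } }

        public class ProductEditModel
        {
            public string Name { get; set; }
            public List<PriceInfoModel> Prices { get; set; }
        }

        public interface IProductRepository { void Save(ProductEditModel model); }

        public class ProductController : Controller
        {
            private readonly IProductRepository _repository;   // hypothetical repository
            public ProductController(IProductRepository repository) { _repository = repository; }

            [HttpPost]
            public ActionResult Save(ProductEditModel model)
            {
                // The default model binder rebuilds the lists from indexed
                // field names (Prices[0].Amount, Prices[1].Amount, ...), so the
                // whole graph arrives in one POST and persists in one call,
                // much like the old product.Save().
                if (!ModelState.IsValid) return View(model);
                _repository.Save(model);
                return RedirectToAction("Index");
            }
        }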

    Read the article

  • How important is it for me to pool my connections?

    - by Rio Mango
    It has been suggested to me that I rearrange my code to "pool" my ADO connections: on each web page I would open one connection and keep using that same open connection. But someone else told me that this was important 10 years ago and is not so important now. If I make, let's say, 5 db calls in one page request, is it problematic to use 5 separate connections that I open and close?
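
    A note on terminology: ADO.NET connection pooling is on by default and is distinct from keeping one connection open per page. A hedged sketch of the open-late/close-early pattern that pooling rewards (connection string and table are placeholders):

        using System.Data.SqlClient;

        static class OrderQueries
        {
            // With pooling on (the default), Dispose returns the physical
            // connection to the pool rather than tearing it down, so five of
            // these per page are five cheap pool check-outs, not five TCP
            // handshakes and logins.
            public static int CountOrders(string connectionString)
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
                {
                    conn.Open();
                    return (int)cmd.ExecuteScalar();
                }
            }
        }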

    Read the article

  • Cannot get to configure Kerberos for Reporting Services

    - by Ucodia
    Context: I am trying to configure Kerberos in the domain for double-hop authentication. Here are the machines and their respective roles:

    client01: Windows 7 client
    dc01: Windows Server 2008 R2 domain controller and DNS
    server01: Windows Server 2008 R2 reporting server (native mode)
    server02: Windows Server 2008 R2 SQL Server database engine

    I want client01 to connect to server01 and configure a data source that is located on server02 using integrated security. Since NTLM cannot push credentials that far, I need to set up Kerberos to enable double-hop authentication. The reporting service runs as the Network Service account and is configured only with the RSWindowsNegotiate option for authentication.

    Issue: I cannot get my client01 credentials passed through to server02 when configuring the data source on server01, so I get the error: Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'. I went to dc01 and delegated full trust for any service to server01, but that did not fix the problem. Note that I did not configure any SPNs for server01, because Reporting Services runs as Network Service, and from what I read on the Internet, when Reporting Services runs as Network Service its SPNs are registered automatically. My problem is that even if I wanted to configure the SPNs manually, I do not know where to set them up: on dc01 or on server01?

    I went a bit further and tried to trace the problem. From my understanding of Kerberos, this is what should happen on the network when I try to connect the data source:

        client01 ---- AS_REQ  ---> dc01
        client01 <--- AS_REP  ---- dc01
        client01 ---- TGS_REQ ---> dc01
        client01 <--- TGS_REP ---- dc01
        client01 ---- AP_REQ  ---> server01
        client01 <--- AP_REP  ---- server01
        server01 ---- TGS_REQ ---> dc01
        server01 <--- TGS_REP ---- dc01
        server01 ---- AP_REQ  ---> server02
        server01 <--- AP_REP  ---- server02

    So I captured my local network with Wireshark, but whenever I try to configure my data source from client01 on server01 to pass my credentials to server02, my client never sends an AS_REQ or TGS_REQ to the KDC on dc01.

    Questions: Can anyone tell me whether I should configure the SPNs, and on which machine they have to be configured? Also, why does client01 never request a TGT or a TGS from my KDC? Do you think something is wrong with the DC role of dc01?
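
    For the SPN half of the question: SPNs live on the Active Directory account that runs the service, so with Reporting Services running as Network Service they belong on server01's computer account, registered from dc01 (or any machine with the rights) along these lines (the domain name is a placeholder):

        setspn -S HTTP/server01 MYDOMAIN\server01$
        setspn -S HTTP/server01.mydomain.local MYDOMAIN\server01$

    The -S switch (available on 2008 R2) checks for duplicate SPNs before adding, which is worth doing since a duplicate HTTP SPN silently breaks Kerberos negotiation.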

    Read the article

  • Updating Versioned .NET Assembly References

    - by ryrich
    I have a C++/CLI project that needs to reference a .NET assembly. I've done so by going into the project properties, clicking "Add New Reference", and browsing to the assembly location (it's not part of the solution, so I cannot create a project-to-project reference, and the .NET assembly is not in the GAC, so it isn't in the .NET tab when viewing the references to add). When the .NET assembly is updated (it is versioned and increments its version number daily), the C++/CLI project fails to compile because it is still referencing the older version. The workaround I've been using is deleting the .NET reference and adding it back in, but this is not feasible. How do I get it to recognize the newer assembly? Note: the older assembly is replaced with the newer one, so it is in the same location, but the project doesn't know it should use the newer version.

    Read the article

• Running a simple integration scenario using the Oracle Big Data Connectors on a Hadoop/HDFS cluster

    - by hamsun
    Between the elephant (the traditional image of the Hadoop framework) and the Oracle Iron Man (Big Data..), an English setter could be seen as the link to the right data.

    Data, data, data: we are living in a world where data technology, based on popular applications, search engines, web servers, rich SMS messages, email clients, weather forecasts and so on, has a predominant role in our life. More and more technologies are used to analyze/track our behavior, to detect patterns, and to propose us "the best/right user experience", from the Google Ad services to telco companies and large consumer sites (like Amazon :) ). The more we use all these technologies, the more we generate data, and thus there is a need for huge data marts and specific hardware/software servers (such as the Exadata servers) in order to treat/analyze/understand the trends and offer new services to the users.

    Some of these "data feeds" are raw, unstructured data and cannot be processed effectively by normal SQL queries. Large-scale distributed processing was an emerging infrastructure need, and the solution seemed to be the "collocation of compute nodes with the data", which in turn led to MapReduce parallel patterns and the development of the Hadoop framework, which is based on MapReduce and a distributed file system (HDFS) that runs on larger clusters of rather inexpensive servers. Several Oracle products use the distributed/aggregation pattern for data calculation (Coherence, NoSQL, TimesTen), so once you are familiar with one of these technologies, let's say with Coherence aggregators, you will find the whole Hadoop/MapReduce concept very similar.

    Oracle Big Data Appliance is based on the Cloudera Distribution (CDH), and the Oracle Big Data Connectors can be plugged into a Hadoop cluster running the CDH distribution or equivalent Hadoop clusters. In this paper, a "lab like" implementation of this concept is done on a single Linux x64 server, running an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 and a single-node Apache hadoop-1.2.1 HDFS cluster, using the SQL connector for HDFS. The whole setup is fairly simple:

    Install on a Linux x64 server (or a VirtualBox appliance) an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 server.

    Get the Apache Hadoop distribution from: http://mir2.ovh.net/ftp.apache.org/dist/hadoop/common/hadoop-1.2.1.

    Get the Oracle Big Data Connectors from: http://www.oracle.com/technetwork/bdc/big-data-connectors/downloads/index.html?ssSourceSiteId=ocomen.

    Check the Java version of your Linux server with the command:

        java -version
        java version "1.7.0_40"
        Java(TM) SE Runtime Environment (build 1.7.0_40-b43)
        Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode)

    Decompress the hadoop-1.2.1.tar.gz file to /u01/hadoop-1.2.1 and modify your .bash_profile:

        export HADOOP_HOME=/u01/hadoop-1.2.1
        export PATH=$PATH:$HADOOP_HOME/bin
        export HIVE_HOME=/u01/hive-0.11.0
        export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin

    (also see my sample .bash_profile)

    Set up ssh trust for the Hadoop process; this is a mandatory step. In our case we have to establish a "local trust", as we are using a single-node configuration: copy the new public keys to the list of authorized keys, then connect and test the ssh setup to your localhost.

    We will run a "pseudo-Hadoop cluster" in what is called "local standalone mode": all the Hadoop Java components run in one Java process, which is enough for our demo purposes.
We need to "fine tune" some Hadoop configuration files, we have to go at our $HADOOP_HOME/conf, and modify the files: core-site.xml hdfs-site.xml mapred-site.xml check that the hadoop binaries are referenced correctly from the command line by executing: hadoop -version As Hadoop is managing our "clustered HDFS" file system we have to create "the mount point" and format it , the mount point will be declared to core-site.xml as: The layout under the /u01/hadoop-1.2.1/data will be created and used by other hadoop components (MapReduce = /mapred/...) HDFS is using the /dfs/... layout structure format the HDFS hadoop file system: Start the java components for the HDFS system As an additional check, you can use the GUI Hadoop browsers to check the content of your HDFS configurations: Once our HDFS Hadoop setup is done you can use the HDFS file system to store data ( big data : )), and plug them back and forth to Oracle Databases by the means of the Big Data Connectors ( which is the next configuration step). You can create / use a Hive db, but in our case we will make a simple integration of "raw data" , through the creation of an External Table to a local Oracle instance ( on the same Linux box, we run the Hadoop HDFS one node cluster and one Oracle DB). Download some public "big data", I use the site: http://france.meteofrance.com/france/observations, from where I can get *.csv files for my big data simulations :). Here is the data layout of my example file: Download the Big Data Connector from the OTN (oraosch-2.2.0.zip), unzip it to your local file system (see picture below) Modify your environment in order to access the connector libraries , and make the following test: [oracle@dg1 bin]$./hdfs_stream Usage: hdfs_stream locationFile [oracle@dg1 bin]$ Load the data to the Hadoop hdfs file system: hadoop fs -mkdir bgtest_data hadoop fs -put obsFrance.txt bgtest_data/obsFrance.txt hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$ hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$hadoop fs -ls hdfs:///user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt Check the content of the HDFS with the browser UI: Start the Oracle database, and run the following script in order to create the Oracle database user, the Oracle directories for the Oracle Big Data Connector (dg1 it’s my own db id replace accordingly yours): #!/bin/bash export ORAENV_ASK=NO export ORACLE_SID=dg1 . 
        #!/bin/bash
        export ORAENV_ASK=NO
        export ORACLE_SID=dg1
        . oraenv
        sqlplus /nolog <<EOF
        CONNECT / AS sysdba;
        CREATE OR REPLACE DIRECTORY osch_bin_path AS '/u01/orahdfs-2.2.0/bin';
        CREATE USER BGUSER IDENTIFIED BY oracle;
        GRANT CREATE SESSION, CREATE TABLE TO BGUSER;
        GRANT EXECUTE ON sys.utl_file TO BGUSER;
        GRANT READ, EXECUTE ON DIRECTORY osch_bin_path TO BGUSER;
        CREATE OR REPLACE DIRECTORY BGT_LOG_DIR as '/u01/BG_TEST/logs';
        GRANT READ, WRITE ON DIRECTORY BGT_LOG_DIR to BGUSER;
        CREATE OR REPLACE DIRECTORY BGT_DATA_DIR as '/u01/BG_TEST/data';
        GRANT READ, WRITE ON DIRECTORY BGT_DATA_DIR to BGUSER;
        EOF

    Put the following in a file named t3.sh and make it executable:

        hadoop jar $OSCH_HOME/jlib/orahdfs.jar \
            oracle.hadoop.exttab.ExternalTable \
            -D oracle.hadoop.exttab.tableName=BGTEST_DP_XTAB \
            -D oracle.hadoop.exttab.defaultDirectory=BGT_DATA_DIR \
            -D oracle.hadoop.exttab.dataPaths="hdfs:///user/oracle/bgtest_data/obsFrance.txt" \
            -D oracle.hadoop.exttab.columnCount=7 \
            -D oracle.hadoop.connection.url=jdbc:oracle:thin:@//localhost:1521/dg1 \
            -D oracle.hadoop.connection.user=BGUSER \
            -D oracle.hadoop.exttab.printStackTrace=true \
            -createTable --noexecute

    Then test the creation of the external table with it:

        [oracle@dg1 samples]$ ./t3.sh
        ./t3.sh: line 2: /u01/orahdfs-2.2.0: Is a directory
        Oracle SQL Connector for HDFS Release 2.2.0 - Production
        Copyright (c) 2011, 2013, Oracle and/or its affiliates. All rights reserved.
        Enter Database Password:
        The create table command was not executed.
        The following table would be created.

        CREATE TABLE "BGUSER"."BGTEST_DP_XTAB"
        (
          "C1" VARCHAR2(4000),
          "C2" VARCHAR2(4000),
          "C3" VARCHAR2(4000),
          "C4" VARCHAR2(4000),
          "C5" VARCHAR2(4000),
          "C6" VARCHAR2(4000),
          "C7" VARCHAR2(4000)
        )
        ORGANIZATION EXTERNAL
        (
          TYPE ORACLE_LOADER
          DEFAULT DIRECTORY "BGT_DATA_DIR"
          ACCESS PARAMETERS
          (
            RECORDS DELIMITED BY 0X'0A'
            CHARACTERSET AL32UTF8
            STRING SIZES ARE IN CHARACTERS
            PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream'
            FIELDS TERMINATED BY 0X'2C'
            MISSING FIELD VALUES ARE NULL
            (
              "C1" CHAR(4000),
              "C2" CHAR(4000),
              "C3" CHAR(4000),
              "C4" CHAR(4000),
              "C5" CHAR(4000),
              "C6" CHAR(4000),
              "C7" CHAR(4000)
            )
          )
          LOCATION
          (
            'osch-20131022081035-74-1'
          )
        ) PARALLEL REJECT LIMIT UNLIMITED;

        The following location files would be created.
        osch-20131022081035-74-1 contains 1 URI, 54103 bytes
              54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt

    Then remove the --noexecute flag and create the external Oracle table for the Hadoop data. Check the results:

        The create table command succeeded.

        CREATE TABLE "BGUSER"."BGTEST_DP_XTAB"
        (
          "C1" VARCHAR2(4000),
          "C2" VARCHAR2(4000),
          "C3" VARCHAR2(4000),
          "C4" VARCHAR2(4000),
          "C5" VARCHAR2(4000),
          "C6" VARCHAR2(4000),
          "C7" VARCHAR2(4000)
        )
        ORGANIZATION EXTERNAL
        (
          TYPE ORACLE_LOADER
          DEFAULT DIRECTORY "BGT_DATA_DIR"
          ACCESS PARAMETERS
          (
            RECORDS DELIMITED BY 0X'0A'
            CHARACTERSET AL32UTF8
            STRING SIZES ARE IN CHARACTERS
            PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream'
            FIELDS TERMINATED BY 0X'2C'
            MISSING FIELD VALUES ARE NULL
            (
              "C1" CHAR(4000),
              "C2" CHAR(4000),
              "C3" CHAR(4000),
              "C4" CHAR(4000),
              "C5" CHAR(4000),
              "C6" CHAR(4000),
              "C7" CHAR(4000)
            )
          )
          LOCATION
          (
            'osch-20131022081719-3239-1'
          )
        ) PARALLEL REJECT LIMIT UNLIMITED;

        The following location files were created.
        osch-20131022081719-3239-1 contains 1 URI, 54103 bytes
              54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt

    [screenshot: the external table viewed from SQL Developer]

    And finally, the number of lines in the Oracle table, imported from our Hadoop HDFS cluster:

        SQL> select count(*) from "BGUSER"."BGTEST_DP_XTAB";

          COUNT(*)
        ----------
              1151

    In a next post we will integrate data from a Hive database and try some ODI integrations with the ODI Big Data connector. Our simplistic approach is just a first step to show you how this unstructured-data world can be integrated into an Oracle infrastructure. Hadoop, Big Data and NoSQL are great technologies; they are widely used, and Oracle offers a large integration infrastructure based on these services. Oracle University presents a complete curriculum on all the related Oracle technologies:

    NoSQL:
      Introduction to Oracle NoSQL Database
      Using Oracle NoSQL Database

    Big Data:
      Introduction to Big Data
      Oracle Big Data Essentials
      Oracle Big Data Overview

    Oracle Data Integrator:
      Oracle Data Integrator 12c: New Features
      Oracle Data Integrator 11g: Integration and Administration
      Oracle Data Integrator: Administration and Development
      Oracle Data Integrator 11g: Advanced Integration and Development

    Oracle Coherence 12c:
      Oracle Coherence 12c: New Features
      Oracle Coherence 12c: Share and Manage Data in Clusters

    Oracle GoldenGate 11g:
      Oracle GoldenGate 11g: Fundamentals for Oracle
      Oracle GoldenGate 11g: Fundamentals for SQL Server
      Oracle GoldenGate 11g: Fundamentals for DB2
      Oracle GoldenGate 11g: Fundamentals for Teradata
      Oracle GoldenGate 11g: Fundamentals for HP NonStop
      Oracle GoldenGate 11g Management Pack: Overview
      Oracle GoldenGate 11g: Troubleshooting and Tuning
      Oracle GoldenGate 11g: Advanced Configuration for Oracle

    Other resources: Apache Hadoop (http://hadoop.apache.org/) is the homepage for these technologies; "Hadoop: The Definitive Guide, 3rd Edition" by Tom White is a classic read for people who want to know more about Hadoop, and some active "googling" will also give you more references.

    About the author: Eugene Simos is based in France and joined Oracle through the BEA-WebLogic acquisition, where he worked in Professional Services, Support, and Education for major accounts across the EMEA region. He has worked in the banking sector and with AT&T and telco companies, giving him extensive experience with production environments. Eugene currently specializes in Oracle Fusion Middleware, teaching an array of courses on WebLogic/WebCenter, Content, BPM/SOA/Identity-Security/GoldenGate/Virtualisation/Unified Comm Suite throughout the EMEA region.

    Read the article

  • How to show or direct a business analyst to a data modelling subject?

    - by AaronLS
    Our business analysts pushed hard to collect data through a spreadsheet. I am the programmer responsible for importing that data. Usually when they push hard for something like this, I never know how well it will work out until a few weeks later, when I have time assigned to work on the task of programming the import of the data. I have tried to do as much as possible along the way: named ranges, data validations, etc. But I usually don't have time to take a detailed look at all the data and compare it to the destination in the database to determine how well it matches up. A lot of the time there will be a little table of items that I somehow have to relate to something else in the database, but there are no natural or business keys present that would allow me to do so. I make the best of this, trying to write something that compares strings and makes a best guess, and then go through the effort of creating interfaces for a user to match the imported data to the destination. I feel that if the business analysts were actually creating a data model, they would be forced to think about these relationships and would appreciate the need for natural or business keys to be part of the spreadsheet for the purposes of smoothly importing the data. The closest they come to business analysis is a big flat list of fields, and that would be fine if it were like any other data dictionary and included data types and relationships, but it isn't. They are just a bunch of names, with no indication of what type of data they might hold; it is up to me to guess. When I have pushed for more detail, they say that it is just busy work. How can I explain the importance of data modelling? How can I tell them what it is and how to do it? It feels impossible, because they don't have an appreciation for its importance. They do, however, usually have an interest in helping out in whatever way they can; it's just that this in particular has never gotten a motivated response.

    Read the article

  • Ray Tracing concerns: Efficient Data Structure and Photon Mapping

    - by Grieverheart
    I'm trying to build a simple ray tracer for specific target scenes. An example of such a scene can be seen below. I'm concerned with which accelerating data structure would be most efficient in this case, since all objects are touching but, on the other hand, the scene is uniform. The objects in my ray tracer are stored as collections of triangles, so I also have access to the individual triangles. Also, when trying to find the bounding box of the scene, how should infinite planes be handled? Should one instead use the viewing frustum to calculate the bounding box? A few other questions I have are about photon mapping. I've read the original paper by Jensen and much more material. In the compact data structure for the photon they introduce, photon power is stored as 4 chars, which from my understanding is 3 chars for color and 1 for flux. But I don't understand how 1 char is enough to store a flux of the order of 1/n, where n is the number of photons (I'm also a bit confused about flux vs power). My other question about photon mapping is whether it would be more efficient in my case to store photons per object (or even per object triangle) instead of using a balanced kd-tree. And the same question about the bounding box of the scene, but for photon mapping: how should one find a bounding box from the point of view of the light when infinite planes are involved?
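
    On the 4-char question: Jensen's compact photon stores power in Ward's shared-exponent RGBE format, so the fourth char is a common exponent rather than a flux mantissa, which is why a power of order 1/n still fits. A hedged C# sketch of that packing (the exact bias and rounding in a given implementation may differ):

        using System;

        static class Rgbe
        {
            // Shared-exponent packing: three 8-bit mantissas plus one 8-bit
            // exponent common to all channels. A flux of order 1/n survives
            // because the exponent byte carries the magnitude.
            public static byte[] Pack(float r, float g, float b)
            {
                float max = Math.Max(r, Math.Max(g, b));
                if (max < 1e-32f) return new byte[4];

                int e = (int)Math.Floor(Math.Log(max, 2.0)) + 1;   // max / 2^e lies in [0.5, 1)
                float scale = 256.0f / (float)Math.Pow(2.0, e);
                return new byte[]
                {
                    (byte)(r * scale), (byte)(g * scale), (byte)(b * scale),
                    (byte)(e + 128)                                // biased shared exponent
                };
            }

            public static void Unpack(byte[] p, out float r, out float g, out float b)
            {
                float scale = (float)Math.Pow(2.0, p[3] - 128) / 256.0f;
                r = p[0] * scale; g = p[1] * scale; b = p[2] * scale;
            }
        }

    For example, a per-channel flux of 1e-6 packs to mantissas around 134 with exponent byte 109, and unpacks to within one part in 256 of the original value.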

    Read the article

  • If I implement a web-service, how do I respond to POST requests with JSON?

    - by Vova Stajilov
    I have to build a rather complex system for my diploma work. Logically it will consist of the following components: a database; a web service; management with a web interface; a client iOS application that will consume the web service. I decided to implement the first three components under .NET. First I will create a database depending on the information load; this part is clear. But then I need a web service that will return data in JSON format for iOS clients to consume; that's obvious and not that hard to implement. For this I will use WCF technology. Now I have a question: if I implement the web service, how will I be able to respond to POST requests with JSON? It probably involves WCF JSON or something related? But I also need some web pages for the admin part, so will this web application be able to consume my centralized web services as well, or should I develop it separately? I just want my web service to act like a set of controllers. There is a related question here, but it doesn't quite reflect my question.
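
    For the JSON-over-POST part, WCF's webHttpBinding does this declaratively; a hedged sketch (the contract and type names are invented, and the service still needs to be wired up with webHttpBinding or the WebServiceHostFactory):

        using System.Runtime.Serialization;
        using System.ServiceModel;
        using System.ServiceModel.Web;

        [ServiceContract]
        public interface IDiplomaApi
        {
            // Accepts a JSON body on POST /items and replies with JSON.
            [OperationContract]
            [WebInvoke(Method = "POST",
                       UriTemplate = "items",
                       RequestFormat = WebMessageFormat.Json,
                       ResponseFormat = WebMessageFormat.Json)]
            ItemDto CreateItem(ItemDto item);
        }

        [DataContract]
        public class ItemDto
        {
            [DataMember] public int Id { get; set; }
            [DataMember] public string Title { get; set; }
        }

    The same endpoint can serve both the iOS client and the admin pages, which keeps the web service acting "like a set of controllers" as described above.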

    Read the article
