Search Results

Search found 60427 results on 2418 pages for 'data transfer'.

Page 25/2418 | < Previous Page | 21 22 23 24 25 26 27 28 29 30 31 32  | Next Page >

  • 12.1.3 Spares Management Enhancements Transfer of Information (TOI)

    - by Oracle_EBS
    A Transfer of Information (TOI) presentation is available. It covers the following enhancements made to the EBS Spares Management product: restrict sources with no shipping network definition; create an internal order when the source is a manned warehouse; display delivery status in the Parts Requirement UI; order sources by distance when the shipping cost remains the same; calculate parts shipping distances using Navteq data; consider the warehouse calendar when calculating the parts arrival date; create requisitions in the operating unit of the destination inventory org; and uptake of the HZ address structure in the Parts Requirement UI.

    Read the article

  • iPhone: Best Method for Passing Data to and from a Server

    - by SAPNA
    I am developing an iPhone application that downloads data from a website. The website's database is SQL-based and the site itself uses classic ASP. I am unsure which method would be best for transferring data to and from the server: SOAP requires XML processing (JSON does not), and I'm not sure how that affects performance or which of the two is better. What would be the best method in general for data transfer given the server configuration we currently have? I'm very new to this field and a bit confused. Any help would be appreciated.

    Read the article

  • JavaScript Data Binding Frameworks

    - by dwahlin
    Data binding is where it's at nowadays when it comes to building client-centric Web applications. Developers experienced with desktop frameworks like WPF or web frameworks like ASP.NET, Silverlight, or others are used to being able to take model objects containing data and bind them to UI controls quickly and easily. When moving to client-side Web development the data binding story hasn't been great, since neither HTML nor JavaScript natively supports data binding. This means that you have to write code to place data in a control and write code to extract it. Although it's certainly feasible to do it from scratch (many of us have done it this way for years), it's definitely tedious and not exactly the best solution when it comes to maintenance and re-use. Over the last few years several different script libraries have been released to simplify the process of binding data to HTML controls. In fact, the subject of data binding is becoming so popular that it seems like a new script library is released nearly every week. Many of the libraries provide MVC/MVVM pattern support in client-side JavaScript apps, and some even integrate directly with server frameworks like Node.js. Here's a quick list of a few of the available libraries that support data binding (if you like any others, please add a comment and I'll try to keep the list updated):

    - AngularJS: MVC framework for data binding (although it closely follows the MVVM pattern).
    - Backbone.js: MVC framework with support for models, key/value binding, custom events, and more.
    - Derby: Provides a real-time environment that runs in the browser and in Node.js. The library supports data binding and templates.
    - Ember: Provides support for templates that automatically update as data changes.
    - JsViews: Data binding framework that provides "interactive data-driven views built on top of JsRender templates".
    - jQXB Expression Binder: Lightweight jQuery plugin that supports bi-directional data binding.
    - KnockoutJS: MVVM framework with robust support for data binding. For an excellent look at using KnockoutJS, check out John Papa's course on Pluralsight.
    - Meteor: End-to-end framework that uses Node.js on the server and provides support for data binding on the client.
    - Simpli5: JavaScript framework that provides support for two-way data binding.
    - WinRT with HTML5/JavaScript: If you're building Windows 8 applications using HTML5 and JavaScript, there's built-in support for data binding in the WinJS library.

    I won't have time to write about each of these frameworks, but in the next post I'm going to talk about my (current) favorite when it comes to client-side JavaScript data binding libraries, which is AngularJS. AngularJS provides an extremely clean way, in my opinion, to extend HTML syntax to support data binding while keeping model objects (the objects that hold the data) free from custom framework method calls or other weirdness. While I'm writing up the next post, feel free to visit the AngularJS developer guide if you'd like additional details about the API and want to get started using it.
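
    To make the "write code to place data in a control and write code to extract it" point concrete, here is a minimal TypeScript sketch of the manual two-way wiring that the libraries above handle for you; the element id and model shape are made up for illustration.

        // Hand-rolled two-way binding: keep one model property and one <input>
        // element in sync without a framework. "nameInput" is a hypothetical id.
        interface PersonModel {
          name: string;
        }

        function bindTextInput(model: PersonModel, key: keyof PersonModel, inputId: string): void {
          const input = document.getElementById(inputId) as HTMLInputElement | null;
          if (!input) {
            throw new Error(`No element with id "${inputId}"`);
          }

          // Model -> view: push the current model value into the control.
          input.value = model[key];

          // View -> model: pull the value back out whenever the user types.
          input.addEventListener("input", () => {
            model[key] = input.value;
          });
        }

        const person: PersonModel = { name: "Ada" };
        bindTextInput(person, "name", "nameInput");

    The binding libraries listed above generalize exactly this pattern (plus change notification back into the view), which is why they pay off quickly as the number of bound fields grows.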

    Read the article

  • Protect Data and Save Money? Learn How Best-in-Class Organizations do Both

    - by roxana.bradescu
    Databases contain nearly two-thirds of the sensitive information that must be protected as part of any organization's overall approach to security, risk management, and compliance. Solutions for protecting data housed in databases vary from encrypting data at the application level to defense-in-depth protection of the database itself. So is there a difference? Absolutely! According to new research from the Aberdeen Group, Best-in-Class organizations experience fewer data breaches and audit deficiencies, at lower cost, by deploying database security solutions. And the results are dramatic: Aberdeen found that organizations encrypting data within their databases achieved 30% fewer data breaches and 15% greater audit efficiency at 34% less total cost compared to organizations encrypting data within applications. Join us for a live webcast with Derek Brink, Vice President and Research Fellow at the Aberdeen Group, next week to learn how your organization can become Best-in-Class.

    Read the article

  • Bad Data is Really the Monster

    - by Dain C. Hansen
    "Bad Data Is Really the Monster" is an article written by Bikram Sinha, from whom I borrowed the title and the inspiration for this blog. Sinha writes: "Bad or missing data makes application systems fail when they process order-level data. One of the key items in the supply-chain industry is the product (aka SKU). Therefore, it becomes the most important data element to tie up multiple merchandising processes including purchase order allocation, stock movement, shipping notifications, and inventory details… Bad data can cause huge operational failures and cost millions of dollars in terms of time, resources, and money to clean up and validate data across multiple participating systems." Yes, bad data really is the monster, so what do we do about it? Close our eyes and hope it stays in the closet? We've tackled this problem for some years now at Oracle, and our latest introduction of Oracle Enterprise Data Quality, along with our integrated Oracle Master Data Management products, provides a complete, best-in-class answer to the bad data monster. What's unique about it? Oracle Enterprise Data Quality combines powerful data profiling, cleansing, matching, and monitoring capabilities while offering unparalleled ease of use. What makes it unique is that it has dedicated capabilities to address the distinct challenges of both customer and product data quality (different monsters have different needs, of course!). The ability to profile data is just as important, to identify and measure poor-quality data and identify new rules and requirements. Included are semantic and pattern-based recognition to accurately parse and standardize data that is poorly structured. Finally, all of the data quality components are integrated with Oracle Master Data Management, including Oracle Customer Hub and Oracle Product Hub, as well as Oracle Data Integrator Enterprise Edition and Oracle CRM. Want to learn more? On Tuesday, Nov 15th, I invite you to listen to our webcast, "Reduce ERP Consolidation Risks with Oracle Master Data Management." I'll be joined by our partner iGate Patni, and we'll be talking about one specific way to deal with the bad data monster, specifically around ERP consolidation. I look forward to seeing you there!
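
    This is not Oracle Enterprise Data Quality itself, but as a rough TypeScript sketch of what "pattern-based standardization" and duplicate matching mean in practice for product (SKU) data, here is a toy standardize-then-match pass; the normalization rules and sample records are assumptions for illustration only.

        // Normalize SKUs with simple pattern rules, then flag records that
        // collapse to the same key as duplicate-match candidates.
        interface ProductRecord {
          sku: string;
          description: string;
        }

        function standardizeSku(raw: string): string {
          return raw
            .toUpperCase()
            .replace(/[\s_]+/g, "-")      // unify separators
            .replace(/[^A-Z0-9-]/g, "");  // drop stray punctuation
        }

        function findDuplicates(records: ProductRecord[]): Map<string, ProductRecord[]> {
          const byKey = new Map<string, ProductRecord[]>();
          for (const rec of records) {
            const key = standardizeSku(rec.sku);
            const group = byKey.get(key) ?? [];
            group.push(rec);
            byKey.set(key, group);
          }
          // Keep only keys shared by more than one record.
          return new Map([...byKey].filter(([, group]) => group.length > 1));
        }

        const sample: ProductRecord[] = [
          { sku: "ab 1234", description: "Widget, blue" },
          { sku: "AB-1234", description: "Blue widget" },
          { sku: "CD_9", description: "Gadget" },
        ];
        console.log(findDuplicates(sample)); // Map { "AB-1234" => [two records] }

    Real data quality tools layer semantic knowledge, fuzzy matching, and monitoring on top of rules like these, but the basic profile-standardize-match loop is the same.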

    Read the article

  • Where can I locate business data to use in my application?

    - by Aaron McIver
    This question talks about any and all free public raw data, which appeared to have valuable pieces but nothing that really provides what I am looking for. Instead of using a socially defined listing of businesses (Foursquare), I would like a business listing data set of registered businesses and associated addresses that could then be searched by location (coordinates). The critical need is that the data set should be filterable on varying criteria (give me all restaurants, coffee shops, etc.). Free data would be great, but anywhere that sells this type of data would also suffice. Infochimps looked like a possibility, but perhaps something a bit more extensive exists. Where can I find a free or for-fee data set of registered businesses that is filterable based on type of business and location?
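
    For a sense of what "filterable by type and searchable by location" looks like once you have such a data set, here is a hedged TypeScript sketch that filters listings by category and radius using the haversine formula; the record shape is an assumption, not any particular vendor's schema.

        // Filter a business listing by category and distance from a coordinate.
        interface Business {
          name: string;
          category: string; // e.g. "restaurant", "coffee shop"
          lat: number;
          lon: number;
        }

        const EARTH_RADIUS_KM = 6371;

        // Great-circle distance between two points (haversine formula).
        function distanceKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
          const toRad = (deg: number) => (deg * Math.PI) / 180;
          const dLat = toRad(lat2 - lat1);
          const dLon = toRad(lon2 - lon1);
          const a =
            Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
          return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a));
        }

        function nearby(listings: Business[], category: string, lat: number, lon: number, radiusKm: number): Business[] {
          return listings.filter(
            (b) => b.category === category && distanceKm(lat, lon, b.lat, b.lon) <= radiusKm
          );
        }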

    Read the article

  • Transfer file using BITS without using IIS on one end?

    - by rwmnau
    I know that I can transfer files using BITS and a wrapper like SharpBITS, but it seems that I need an IIS server on one end, either to upload to or download from. Is there a way to use the BITS protocol to transfer a file without requiring IIS, perhaps some kind of "BITS server" or "listener" project that my client's BITS service can connect to? I'm looking for functionality that's exactly what BITS provides, but I'd prefer not to require that IIS be installed (though if I have to, I can). Thanks!

    Read the article

  • How do you transfer an AWS RDS snapshot to a different AWS account?

    - by Webmonger
    Hi, I have an RDS database and I need to transfer a snapshot of it to another AWS account. I understand there are issues with doing this between availability zones, so I'm really unsure if this is possible. The RDS instance is MySQL. If it's not possible to transfer the snapshot, could you please explain how to transfer the data from one RDS instance to another without downloading any of the contents (the DB is over 200 GB)? Thanks in advance.
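
    Assuming a current AWS SDK (the JavaScript v3 SDK is shown here in TypeScript) and an engine that supports snapshot sharing, the usual route is to share a manual snapshot with the other account, which can then copy it or restore from it. The identifiers below are placeholders, and the exact command and parameter names reflect today's SDK rather than what was available when this question was asked.

        // Share a manual RDS snapshot with another AWS account.
        import {
          RDSClient,
          ModifyDBSnapshotAttributeCommand,
        } from "@aws-sdk/client-rds";

        const rds = new RDSClient({ region: "us-east-1" });

        async function shareSnapshot(snapshotId: string, targetAccountId: string): Promise<void> {
          // Adding the target account to the "restore" attribute makes the
          // snapshot visible (and restorable) in that account.
          await rds.send(
            new ModifyDBSnapshotAttributeCommand({
              DBSnapshotIdentifier: snapshotId,
              AttributeName: "restore",
              ValuesToAdd: [targetAccountId],
            })
          );
        }

        shareSnapshot("mydb-manual-snapshot", "123456789012").catch(console.error);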

    Read the article

  • Is there a remote file transfer command that preserves nanosecond timestamps?

    - by Denver Gingerich
    I've tried transferring files using scp and rsync on Ubuntu 10.04, but neither of them preserves more than second precision. Here's an example:

        $ touch test1
        $ scp -p test1 localhost:test2
        $ ls -l --full-time test*
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.579717282 -0500 test1
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.000000000 -0500 test2
        $ cp -p test1 test2
        $ ls -l --full-time test*
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.579717282 -0500 test1
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.579717282 -0500 test2
        $

    A straight copy works fine, but scp truncates the timestamp. Are there any tools (preferably similar to scp or rsync in their usage) that do remote file transfers while preserving nanosecond timestamps? I could write a hacky script to do it, but I'd rather not.
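
    If you end up scripting around this, note that Node's fs.statSync with { bigint: true } exposes nanosecond mtimes, so you can at least verify whether a given transfer preserved them. A small TypeScript sketch, with the file names taken from the example above:

        // Compare the nanosecond mtimes of a source file and its transferred copy.
        import { statSync } from "node:fs";

        function mtimeNs(path: string): bigint {
          // bigint: true returns BigIntStats, which carries nanosecond-precision fields.
          return statSync(path, { bigint: true }).mtimeNs;
        }

        const src = mtimeNs("test1");
        const dst = mtimeNs("test2");

        console.log(`source: ${src} ns, copy: ${dst} ns`);
        console.log(src === dst ? "timestamps match exactly" : "precision was lost in transfer");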

    Read the article

  • AFP, SMB, NFS: which is the best data transfer protocol?

    - by Kami
    I have a computer with large hard disks running Gentoo. I have to serve medium-to-large files over a wired network to Apple devices (all of them running OS X). Which protocol is best for the following needs: speed; ease of use (for both the clients and the server); fewest limitations (maximum file size, restricted character sets for filenames); and security?

    Read the article

  • Is there an easy way to mass-transfer all files between two computers?

    - by Raven Dreamer
    I'm currently visiting my parents for the Christmas holidays, and as I'm sure is the case for many in the post-tech generation, my arrival brings with it the assumption of personal tech support. I've been tasked with setting up my folks' new computer, and they want to make sure all their files, yes, all their files, get transferred over to the new device. Short of manually dragging each folder in the C:/ directory onto an external hard drive and then out onto the recipient computer, is there an easier or faster way to do this?

    Read the article

  • Content-Length and Transfer-Encoding: chunked with nginx and node-http-proxy

    - by rampr
    I have the following setup: node-http-proxy acts as a reverse proxy, forwarding all requests to nginx or socket.io as necessary. My problem is this: when I send an HTTP DELETE request from the browser, node-http-proxy adds a "Transfer-Encoding: chunked" header because the request from the browser has no Content-Length. The request has no Content-Length because it has no body. Nginx doesn't like the chunked Transfer-Encoding header and throws a 411, asking for a Content-Length. The problem goes away when I send dummy data as part of the DELETE request: there is then a Content-Length, node-http-proxy doesn't add the Transfer-Encoding header, and nginx is happy. I want to understand whether node-http-proxy is working as expected when it adds a "Transfer-Encoding: chunked" header just because the Content-Length is missing on a body-less request.
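
    One common workaround, rather than padding the DELETE with dummy data, is to intercept the outgoing proxied request and give body-less methods an explicit Content-Length of 0 so nothing needs to be sent chunked. A hedged TypeScript sketch against node-http-proxy's proxyReq hook follows; ports and routing are placeholders, and whether this is sufficient depends on the node-http-proxy version in use.

        // Reverse proxy that forces "Content-Length: 0" on body-less DELETE
        // requests so the upstream nginx never sees "Transfer-Encoding: chunked".
        // The default import assumes esModuleInterop is enabled.
        import httpProxy from "http-proxy";
        import { createServer } from "node:http";

        const proxy = httpProxy.createProxyServer({ target: "http://127.0.0.1:8080" });

        proxy.on("proxyReq", (proxyReq, req) => {
          if (req.method === "DELETE" && req.headers["content-length"] === undefined) {
            // Tell the upstream explicitly that there is no body.
            proxyReq.removeHeader("Transfer-Encoding");
            proxyReq.setHeader("Content-Length", "0");
          }
        });

        createServer((req, res) => {
          proxy.web(req, res);
        }).listen(3000);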

    Read the article

  • Fast (non-blocking) way to transfer many files to another server

    - by Nyxynyx
    I am currently attempting to transfer over 1 million files from one server to another. Using wget, it seems to be extremely slow, probably because it starts a new transfer only after the previous one has completed. Question: is there a faster, non-blocking (asynchronous) way to do the transfer? I do not have enough space on the first server to compress the files into a tar.gz and transfer that over. Thanks!
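
    One approach, since neither server has room for a single large archive, is to split the file list into batches and run several rsync processes concurrently instead of one sequential stream. A TypeScript (Node) sketch follows; the host, paths, batch size, and concurrency are placeholders, and the file list is assumed to contain paths relative to the source root.

        // Transfer a large file list by running several rsync batches in parallel.
        import { execFile } from "node:child_process";
        import { promisify } from "node:util";
        import { mkdtemp, readFile, writeFile } from "node:fs/promises";
        import { tmpdir } from "node:os";
        import { join } from "node:path";

        const run = promisify(execFile);
        const BATCH_SIZE = 10_000;
        const CONCURRENCY = 8;

        async function transferAll(listFile: string, srcRoot: string, dest: string): Promise<void> {
          const files = (await readFile(listFile, "utf8")).split("\n").filter(Boolean);
          const dir = await mkdtemp(join(tmpdir(), "batches-"));

          // Write one --files-from list per batch.
          const batches: string[] = [];
          for (let i = 0; i < files.length; i += BATCH_SIZE) {
            const batchPath = join(dir, `batch-${i / BATCH_SIZE}.txt`);
            await writeFile(batchPath, files.slice(i, i + BATCH_SIZE).join("\n"));
            batches.push(batchPath);
          }

          // Simple worker pool: CONCURRENCY rsync processes in flight at a time.
          const queue = [...batches];
          const worker = async () => {
            for (let batch = queue.shift(); batch !== undefined; batch = queue.shift()) {
              await run("rsync", ["-a", `--files-from=${batch}`, srcRoot, dest]);
            }
          };
          await Promise.all(Array.from({ length: CONCURRENCY }, worker));
        }

        transferAll("all-files.txt", "/data/", "user@remote:/data/").catch(console.error);

    Piping a tar stream over ssh is another common answer for very large numbers of small files, since it avoids per-file round trips entirely.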

    Read the article

  • XP can't read data from transferred HD

    - by Alexander Miller
    Computer A, running XP, died. XP was installed on a fresh HD in computer B. The data-backup HD from A was installed as a slave in B. B will not read it; it shows only two folders, Recycler and System Volume Info. All of these are older machines with IDE drives. What's going on, and how can I read or transfer the data from the transferred drive? This was only a trial run. I will actually need to transfer the master HD from A, which has XP on it, and read from its data partition, because (blush) the backup drive was not up to date.

    Read the article

  • How to transfer files from a Windows 7 PC to Vista

    - by Samuel C
    I tried a direct connection using an Ethernet cable and Windows Easy Transfer, but no luck. I also tried an ad-hoc network, HomeGroup, and connecting through a router, both wired and wireless, but still no luck. I'm getting a little frustrated, as I need to transfer these files because I'm selling one of the machines this afternoon! All I need to do is transfer some documents and files. The Windows 7 PC recognizes the Vista PC, but the Vista PC can't see the Windows 7 one.

    Read the article

  • Is there any chance that my data will get silently corrupted with a robocopy SMB network transfer?

    - by Archagon
    I'm setting up a NAS box for the first time. At the moment, I have most of my data backed up to a few local hard drives, and I intend to transfer all the data to my NAS over Ethernet once the RAID array is set up. Since this is all happening over the network, I'm a bit worried about my data getting silently corrupted during the transfer. From what I understand, data generally doesn't get corrupted without notice on local transfers because a checksum is performed at some point by the drive or the OS. (This could be totally wrong.) Does the same thing happen with SMB, or is it up to the sender to check the integrity of their data? And if it doesn't happen with SMB, is there a protocol that does ensure data integrity? I know that rsync can checksum a transfer, but I'm on Windows and I already have a robocopy configuration that I like. Will my data be safe, or do I have to use an external checksum tool to make sure?
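
    robocopy does not do end-to-end checksumming on its own, so one option is a small verification pass after the copy. Below is a TypeScript (Node) sketch that hashes every file in the source tree and compares it with the same relative path in the destination; the directory paths are placeholders.

        // Verify a copy by comparing SHA-256 hashes of source and destination files.
        import { createHash } from "node:crypto";
        import { createReadStream } from "node:fs";
        import { readdir } from "node:fs/promises";
        import { join, relative } from "node:path";

        // Stream a file through SHA-256.
        function sha256(path: string): Promise<string> {
          return new Promise((resolve, reject) => {
            const hash = createHash("sha256");
            createReadStream(path)
              .on("error", reject)
              .on("data", (chunk) => hash.update(chunk))
              .on("end", () => resolve(hash.digest("hex")));
          });
        }

        // Recursively list every regular file under a directory.
        async function listFiles(dir: string): Promise<string[]> {
          const out: string[] = [];
          for (const entry of await readdir(dir, { withFileTypes: true })) {
            const full = join(dir, entry.name);
            if (entry.isDirectory()) out.push(...(await listFiles(full)));
            else if (entry.isFile()) out.push(full);
          }
          return out;
        }

        async function verifyTree(srcRoot: string, dstRoot: string): Promise<void> {
          for (const srcPath of await listFiles(srcRoot)) {
            const rel = relative(srcRoot, srcPath);
            const [a, b] = await Promise.all([sha256(srcPath), sha256(join(dstRoot, rel))]);
            console.log(a === b ? `OK       ${rel}` : `MISMATCH ${rel}`);
          }
        }

        verifyTree("D:/backup", "Z:/backup").catch(console.error);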

    Read the article

  • Sybase PowerDesigner: Change Many (Find/Replace/Convert) Data Items' Data Types

    - by Andy
    Hello, I have a relatively large Conceptual Data Model in PowerDesigner. After generating a Physical Data Model and seeing the DBMS data types, I need to update the data types (NUMBER/TEXT) for each data item. I'd like to either do a find/replace within the Conceptual Data Model or somehow map to different data types when creating the Physical Data Model. For example: change the automatic conversion of Text to CLOB so that Text maps to NVARCHAR(20) instead. Thanks!

    Read the article

  • Are there any useful datasets available on the web for data mining?

    - by niko
    Hi, does anyone know a good resource where example (real) data can be downloaded for experimenting with statistics and machine learning techniques such as decision trees? Currently I am studying machine learning techniques, and it would be very helpful to have real data for evaluating the accuracy of various tools. If anyone knows a good resource (perhaps CSV or XLS files, or any other format), I would be very thankful for a suggestion.

    Read the article
