Search Results

Search found 59643 results on 2386 pages for 'data migration'.

  • Creating Indicator and Gauge Report in SSRS - SQL Server 2008 R2

    SQL Server Reporting Services allows you to embed indicators and gauges in your report to analyze the data and its state. Indicators are minimal gauges that convey the state of a single data value at a glance and are mostly used to represent the state of a Key Performance Indicator (KPI).

    Read the article

  • Oracle Magazine, March/April 2005

    Oracle Magazine March/April 2005 features articles on managing unstructured content, coordinating business processes, Oracle's Austin Data Center, getting started with Oracle ADF, Oracle XML Data Synthesis, SQL analytics, using materialized views, and much more.

    Read the article

  • Tissue Specific Electrochemical Fingerprinting on the NetBeans Platform

    - by Geertjan
    Proteomics and metalloproteomics are rapidly developing interdisciplinary fields providing enormous amounts of data to be classified, evaluated, and interpreted. Approaches offered by bioinformatics and also by biostatistical data analysis and treatment are therefore becoming increasingly relevant. A bioinformatics tool for analysis and visualization in this domain has been developed on the NetBeans Platform at universities in Prague and Brno, in the Czech Republic. More info: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0049654

    Read the article

  • XML DATATYPE (series 1)

    New to SQL Server 2005 is the XML data type, which lets you store XML documents and fragments in a SQL Server database. An XML fragment is an XML instance that is missing a single top-level element. You can create columns and variables of the XML type and store XML instances in them. Note that the stored representation of an XML data type instance cannot exceed 2 GB.
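    A minimal sketch of what this looks like in T-SQL (the table, column, and sample XML below are hypothetical, not taken from the article):

    -- Hypothetical example: a table with an xml column (SQL Server 2005 syntax).
    CREATE TABLE dbo.OrderArchive
    (
        OrderId   int IDENTITY PRIMARY KEY,
        OrderData xml NOT NULL   -- each stored instance is limited to 2 GB
    );

    INSERT INTO dbo.OrderArchive (OrderData)
    VALUES ('<order id="1"><item sku="A100" qty="2" /></order>');

    -- An xml variable works the same way, and can hold a fragment
    -- (an instance without a single top-level element).
    DECLARE @frag xml;
    SET @frag = '<item sku="A100" qty="2" /><item sku="B200" qty="1" />';
    SELECT @frag;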

    Read the article

  • The Benefits of Using Ongoing Search Engine Optimization Services

    Many new clients of search engine optimization companies are unsure if they should opt for one-time search engine optimization services or ongoing services. One-time SEO services are appropriate for new websites for several reasons. Once you do the initial optimization of a new website, you have to wait a while in order to receive data that will help you determine the direction in which you need to optimize the site. A period of six months or so will give you enough data to determine what will and won't work for the site.

    Read the article

  • SQL Server 2012 Integration Services - Implementing Package Security using Access Control

    SQL Server 2012 Integration Services offers a wide range of powerful features that allow you to streamline and automate tasks involving data extraction, transformation, and loading. However, incorporating these features into your existing business intelligence framework frequently requires additional security measures to ensure that the data being processed remains protected from unauthorized access.

    Read the article

  • Custom Business Application Development in PHP - Features & Benefits

    A revolution is taking place within the business application arena today. Just a few years back, most custom business applications, such as CRM, ERP, data mining, and other business information systems, were inflexible and expensive. The database ran on a server on the company's premises, and each desktop machine ran a client application.

    Read the article

  • Oracle Magazine, November/December 2008

    Oracle Magazine November/December 2008 features articles on our Editors' Choice Awards 2008, the new HP Oracle Database Machine, using task flows, Cursor FOR Loops, Oracle Data Access Components, Oracle Active Data Guard, SQL Developer and PL/SQL constructs, Oracle Database 11g, questions for Tom Kyte, and much more.

    Read the article

  • Is there a way to replicate a very large file shares in real-time?

    - by fsckin
    I have an hourly cron job that copies about 40GB of data from a source folder into a new folder with the hour appended to the end. When it's done, the job prunes anything older than 24 hours. This data changes very often during work hours and is on a Samba file share. Here's how the folder structure looks:
    \\server\Version.1
    \\server\Version.2
    \\server\Version.3
    ...
    \\server\Version.24
    The contents of each new folder usually don't change very much compared to the last one, since this is an hourly job. Now you might be thinking that I'm an idiot for dreaming this up. Truth is, I just found out about it. It's actually been used for years and is so incredibly simple that anyone could delete the ENTIRE 40GB share (imagine that dialog spooling up... deleting thousands and thousands of files), and it would actually be faster to restore by moving the latest copy back to the source than it took to delete. Brilliant!
    Now to top this off, I need to efficiently replicate this 960GB of "mostly similar" data to a remote server over a WAN link, with the replication happening as close to real-time as possible -- think hot spare, disaster recovery, etc. My first thought was rsync. Total failure: rsync sees a deletion of the folder that is 24 hours old and the addition of a new folder with 30GB of data to sync! I also looked at rdiff-backup and unison; they both appear to use similar algorithms and do not keep enough metadata to do this intelligently. The best thing I can find "out of the box" to do this is Windows Server "Distributed Filesystem Replication", which uses "Remote Differential Compression" -- after reading the background information on how it works, it actually looks like exactly what I need. Problem: both servers are running Linux. D'oh!
    One approach I'm looking at is this; say it's 5 AM and the cron job finishes:
    1. The new Version.5 folder arrives on the local server.
    2. SSH to the remote server and copy Version.4 to Version.5.
    3. Run rsync on the local server, pushing changes to the remote server. Rsync finally knows to do a differential copy between Version.4 and Version.5.
    Is there a smarter way to replicate Samba shares as close to real-time as possible? Anything out there that does "Remote Differential Compression" on Linux?

    Read the article

  • ssh connection error

    - by evaG
    I'm trying to log into an Ubuntu desktop and I get the following error message:
    PTY allocation request failed
    What does it mean, and how can I connect to my desktop? Thanks.
    Edit:
    debug1: Reading configuration data /home/evag/.ssh/config
    debug1: /home/evag/.ssh/config line 1: Applying options for *
    debug1: Reading configuration data /etc/ssh/ssh_config
    debug1: /etc/ssh/ssh_config line 19: Applying options for *
    debug1: auto-mux: Trying existing master
    debug1: mux_client_request_session: master session id: 2
    PTY allocation request failed

    Read the article

  • No Silverlight and Preloader Experience(ish) - in 10 seconds...

    Here are the basics...:
    <body background="Img/ScreenShot.png">
    <object id="SilverlightControlObj" name="SilverlightControlObjNm" data="data:application/x-silverlight-2," type="application/x-silverlight-2" width="100%" height="100%">
    <param name="source" value="ClientBin/SUGWTK.xap" />
    <param name="onError" value="onSilverlightError" />
    <param name="background" value="white" />
    <param name="minRuntimeVersion" value="3.0.40624.0" />
    <param name="autoUpgrade" value="true"...

    Read the article

  • Problem with partitions and 12.04

    - by Wejq
    Can I have some help? I tried to install Ubuntu 12.04 on my laptop, which has several partitions:
    2 NTFS (one of them is restricted by the system)
    2 ext4
    1 Linux swap
    But when I insert my CD and run the live CD (as I am doing now), the installer can't see any of my partitions; it sees only /dev/sda as unallocated space, and so does GParted (fdisk seems OK). On these partitions I have data that I can use from Windows (on the NTFS ones). Here is some of this data:

    Read the article

  • How to Use KDE's Clipboard and Klipper App

    MakeTechEasier: "KDE has an advanced clipboard system, largely due to a small program called Klipper, which can store more than one piece of data. KDE also has the ability to copy and move files with copying and pasting, and automatic creation of files using clipboard data."

    Read the article

  • A quick look at: sys.dm_os_buffer_descriptors

    - by fatherjack
    SQL Server places data into cache as it reads it from disk, so as to speed up future queries. This DMV lets you see how much data is cached at any given time, and knowing how this changes over time can help you ensure your servers run smoothly and are adequately resourced to run your systems. This DMV gives the number of cached pages in the buffer pool along with the database id that they relate to:
    USE [tempdb]
    GO
    SELECT COUNT(*) AS cached_pages_count , CASE database_id ...(read more)
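    The query above is cut off in this excerpt; as a rough sketch (not the article's exact code), a query over this DMV typically groups the cached pages by database and converts the page count into megabytes:

    -- Sketch only: cached pages and approximate cache size per database.
    -- database_id 32767 is the internal resource database.
    SELECT CASE database_id
               WHEN 32767 THEN 'ResourceDb'
               ELSE DB_NAME(database_id)
           END AS database_name ,
           COUNT(*) AS cached_pages_count ,
           COUNT(*) * 8 / 1024 AS cached_size_mb   -- buffer pool pages are 8 KB
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY cached_pages_count DESC;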

    Read the article

  • Oracle Snapshot Not Working [closed]

    - by nayef harb
    I have created a snapshot that takes data from 2 tables and has a refresh rate of 1 day. The snapshot data is not refreshing; it is still the same. Is there something that I am missing? Here is the code:
    CREATE SNAPSHOT test
    REFRESH COMPLETE
    START WITH SYSDATE
    NEXT SYSDATE + 1
    AS
    SELECT item_code, item_conc_code, tran_bran_code, SUM(tran_qty) bal_qty
    FROM tranhist a, itemmast b
    WHERE a.tran_item_code = b.item_code
    GROUP BY item_code, item_conc_code, tran_bran_code

    Read the article

  • The Importance of Backing Up Your Website Or Blog

    Backing up a site or blog consists of storing files and data in another location. That way, if something should happen to your site or blog, you'll still have a copy of all the data. Backing up the information isn't all that difficult, and you can save a lot of time and effort in doing so.

    Read the article

  • Use Thread-local Storage to Reduce Synchronization

    Synchronization is often an expensive operation that can limit the performance of a multithreaded program. Using thread-local data structures instead of data structures shared by the threads can reduce synchronization in certain cases, allowing a program to run faster.

    Read the article

  • Developing a Custom SSIS Source Component

    SSIS was designed to be extensible. Although, using the components provided, you can create tasks that take data from a wide variety of sources, transform the data in a number of ways, and write the results to a wide choice of destinations, there will always be occasions when you need to build your own custom SSIS component. Yes, it is time to hone your C# skills and cut some code, as Saurabh explains.

    Read the article

  • Proper set up shared folders for users

    - by user221486
    First I would like to say thanks for helping. I have a huge problem setting up the proper permissions for shared folders. I have:
    Windows 7 x64 Ent. - name: backupfb - added to the domain, with a shared folder on drive E: (E:\backup)
    50 clients/laptops with Tivoli Storage Manager FastBack for Workstations that save files to the shared folder
    I need to configure the permissions for my shared folders so that only the owner of a folder can access it. The folder structure is:
    E:\backup <- shared as the "backup" folder (\\backupfb\backup\)
    E:\backup\BackupAdmin <- used by the Tivoli Storage Manager FastBack for Workstations client to download revisions and configurations; nodes require read-only access to these directories
    E:\backup\RealTimeBackup <- client accounts create directories here that are only accessible by the account that created them; as a result, the directory that contains data for a node is not created until that node connects to the server
    So the permissions should look like this (taken from the instructions), with inheritable permissions from the object's parent disabled. Permission entries:
    \\backupfb\backup\BackupAdmin
    Allow Users: Read, Execute (this folder, subfolders, and files) - Traverse Folder / Execute, List Folder / Read Data, Read Attributes, Read Extended Attributes, Delete Subfolders and Files, Delete, Read Permissions
    Allow Administrators: Full Control (this folder, subfolders, and files)
    Both folders have the option "apply these permissions to objects and/or containers within this container only" enabled. Here everything works fine.
    \\backupfb\backup\RealTimeBackup
    Allow Administrators: Full Control (this folder, subfolders, and files)
    Allow CREATOR OWNER: Full Control (this folder, subfolders, and files)
    Allow Users (from domain): Special (this folder only) - Traverse Folder / Execute, List Folder / Read Data, Read Attributes, Read Extended Attributes, Create Files / Write Data, Create Folders / Append Data, Delete Subfolders and Files, Read Permissions
    Allow OWNER RIGHTS*: Full Control (this folder, subfolders, and files)
    Here I have a huge problem with CREATOR OWNER. I am able to set Full Control, but I can only apply it to "Subfolders and files only"; when I change the properties to "This folder, subfolders and files" and save, it changes back to "Subfolders and files only".
    So I tried using icacls to set up the permissions:
    @echo off
    takeown /F E:\backup\ /R /A
    for /D %%i IN (E:\backup\RealTimeBackup*) DO icacls E:\backup\RealTimeBackup\%%~nxi /grant:r cloud\%%~nxi:F /T /C
    pause
    But after that, users are able to create just one folder in \\backupfb\backup\RealTimeBackup\userfolder; the problem is with subfolders. In the log I have:
    FBW5022E Unable to access the specified file
    Explanation: The file specified is unable to be accessed. Possibly spelled incorrectly, or bad path, or permissions.
    User response: Ensure the user has the proper permissions for the file and directories involved and that the file and directory exist.
    Any ideas? Please help ;-) Thanks.

    Read the article

  • The Perils of Running Database Repair

    In a perfect world everyone has the right backups to be able to recover within the downtime and data-loss service level agreements when accidental data loss or corruption occurs. Unfortunately we don’t live in a perfect world, and so many people find that they don’t have the backups they need to recover when faced with corruption.

    Read the article

  • Openmpi 1.6.3 on ubuntu 12.10

    - by torem
    I manually installed openmpi 1.6.3 from the tar.gz on Ubuntu 12.10. But now mpif90.openmpi returns the following:
    Cannot open configuration file /usr/local/share/openmpi/mpif90.openmpi-wrapper-data.txt
    Error parsing data file mpif90.openmpi: Not found
    How can I get mpif90.openmpi running again? It was running fine when I installed openmpi using apt-get install, but that way I only get version 1.6.1. Thanks.

    Read the article

  • Developing a Tablet Application for Sales Force

    Mobile access to data is becoming part of our lives. Not only in the personal sphere, but especially across a variety of industries, there is a growing demand for mobile data access. People use laptops, smartphones, and pocket PCs, and the market is opening up to tablets more than ever before. Let's find out what developers might use when developing tablet applications.

    Read the article
