Search Results

Search found 58566 results on 2343 pages for 'data modelling'.


  • Transparent Data Encryption

    Transparent Data Encryption is designed to protect data by encrypting the physical files of the database, rather than the data itself. Its main purpose is to prevent someone who obtains the physical files from gaining access to the data by restoring them to another server: with Transparent Data Encryption in place, restoring the files requires the original encryption certificate and master key. It was introduced in the Enterprise edition of SQL Server 2008. John Magnabosco explains fully, and guides you through the process of setting it up.

    Read the article

  • Forbes Article on Big Data and Java Embedded Technology

    - by hinkmond
    Whoa, cool! Forbes magazine has an online article about what I've been blogging about all this time: Big Data and Java Embedded Technology, tying it all together with a big bow, connecting small devices to the data center. See: Billions of Java Embedded Devices. Here's a quote: By the end of the decade we could see tens of billions of new Internet-connected devices... with billions of Internet-connected devices generating Big Data, are the next big thing. ... That's why Oracle has put together an ecosystem of solutions for this new, Big Data-oriented device-to-data center world: secure, powerful, and adaptable embedded Java for intelligent devices, integrated middleware... This is the next big thing. Java SE Embedded Technology is something to watch for in the new year. Start developing for it now to get a head start... Hinkmond

    Read the article

  • Google I/O 2010 - Data migration in App Engine

    Google I/O 2010 - Data migration in App Engine (App Engine 201), presented by Matthew Blain. Learn about the App Engine bulk loader and see an example of migrating data from an external data source into the App Engine datastore - and back out. Do you have data stored in a traditional, relational DB which you'd like to upload to App Engine? This session will teach you how. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 44:26.

    Read the article

  • R: How can I use apply on rows of a data.frame and get out $column_name?

    - by John
    I'm trying to access $a using the following example:

      df <- data.frame(a=c("x","x","y","y"), b=c(1,2,3,4))
      > df
        a b
      1 x 1
      2 x 2
      3 y 3
      4 y 4

      test_fun <- function (data.frame_in) {
          print(data.frame_in[1])
      }

    I can now access $a if I use an index for the first column:

      > apply(df, 1, test_fun)
        a
      "x"
        a
      "x"
        a
      "y"
        a
      "y"
      [1] "x" "x" "y" "y"

    But I cannot access column $a with the $ notation ("$ operator is invalid for atomic vectors"):

      test_fun_2 <- function (data.frame_in) {
          print(data.frame_in$a)
      }

      > apply(df, 1, test_fun_2)
      Error in data.frame_in$a : $ operator is invalid for atomic vectors

    Is this not possible?

    Read the article

  • Importing Data from Google Analytics

    - by Adam Tannon
    I am planning on building a web app with many different public-facing HTTP servers, each of which will have Google Analytics (GA) installed on it. I'd like to create a "dashboard" app that consolidates the GA data into one screen. I've been perusing the documentation for this so-called GA API, but I can't tell what the end result of the GA API is: does the GA API allow me to do exactly what I am looking for it to do, or does it do something entirely different (like allow me to share my data with Google+ or something else weird)? Since an API can be used to CRUD any kind of data, I guess I'm asking which way the GA API goes: is it for querying (reading) data from 1+ server instances, or is it for modifying data on those servers or somewhere else? Thanks in advance!

    Read the article

  • Using one data source across multiple views in Kendo UI SPA

    - by user3731783
    I am trying to build a Kendo UI SPA. I have two views: view 1 (appListView) shows Application Details in a grid, and view 2 (activityView) has a dropdown for application names and a grid that shows the activity for the selected application. Since I load all the application details when view 1 loads, I would like to re-use those details to populate the dropdown on view 2. Please see my code below. Everything works fine, but when I go to view 2 it makes a call to the service again to get the application details. I would like to use the existing data if it is already loaded; if the user comes to view 2 directly, then it should still fetch the application data. I am not sure what I am missing in the code.

    View Markup:

      <script id="appListView" type="text/x-kendo-template">
          <h3 data-bind="html: displayName"></h3>
          <div data-role="grid"
               data-editable="{'mode':'popup'}"
               data-bind="source: items"
               data-columns="[
                   {'field': 'Name'},
                   {'field': 'ContactEmail','title':'Contact Email'}
               ]">
          </div>
      </script>

      <script id="" type="text\x-kendo-template">
          <div>
              Activity for Application&nbsp;&nbsp;
              <input name="AppName" data-role="dropdownlist"
                     data-source="appsModel.items"
                     data-text-field="Name" data-value-field="Id"
                     data-option-label="Choose an application name"
                     style="width:250px;" />
          </div>
          <div id="Activities" data-role="grid"
               data-bind="source: items" data-auto-bind="false"
               data-columns="[
                   {'field': 'Domain','title':'Domain'},
                   {'field': 'ActivityType','title':'Activity Type'}
               ]">
          </div>
      </script>

    js with DataSource and View Model:

      //data sources
      var applications = new kendo.data.DataSource({
          schema: { model: { id: "Id" } },
          serverFiltering: true,
          transport: {
              read: { url: '/api/App', dataType: 'json', type: 'GET' }
          }
      });

      var activities = new kendo.data.DataSource({
          schema: { model: { id: "Id" } },
          transport: {
              read: { url: '/api/Activity', dataType: 'json', type: 'GET' },
              parameterMap: function (data, type) {
                  if (type == "read") {
                      return 'appId=' + $("#AppName").val();
                  }
              }
          }
      });

      //Models
      var appsModel = kendo.observable({
          items: applications,
          displayName: 'My Applications'
      });

      var activityModel = kendo.observable({
          items: activities,
          onAppChange: function (t) {
              $("#Activities").data("kendoGrid").dataSource.read();
          },
          dispayName: 'Application Activities'
      });

      //views
      var layout = new kendo.Layout("layout-template");
      var appListView = new kendo.View("appListView", { model: appsModel });
      var activityView = new kendo.View("activityView", { model: activityModel });

    Thank you for taking the time to read this long question.

    Read the article

  • Displaying a Sorted, Paged, and Filtered Grid of Data in ASP.NET MVC

    Over the past couple of months I've authored five articles on displaying a grid of data in an ASP.NET MVC application. The first article in the series focused on simply displaying data. This was followed by articles showing how to sort, page, and filter a grid of data. We then examined how to both sort and page a single grid of data. This article looks at how to add the final piece to the puzzle: we'll see how to combine sorting, paging, and filtering when displaying data in a single grid, roughly along the lines of the sketch below. As with its predecessors, this article offers step-by-step instructions and includes a complete, working demo available for download at the end of the article. Read on to learn more!
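    As a rough sketch of the general shape only (not the article's actual code; Product, its fields, and the in-memory list are illustrative stand-ins), a controller action can apply the filter first, then the sort, then Skip/Take paging over an IQueryable before handing one page of rows to the view:

      using System.Collections.Generic;
      using System.Linq;
      using System.Web.Mvc;

      public class Product
      {
          public string Name { get; set; }
          public decimal Price { get; set; }
      }

      public class ProductsController : Controller
      {
          private const int PageSize = 10;

          // Stand-in for a real data source (e.g. an ORM-backed IQueryable).
          private static readonly IQueryable<Product> AllProducts =
              new List<Product>().AsQueryable();

          // GET: /Products?filter=chain&sortBy=Price&page=2
          public ActionResult Index(string filter, string sortBy = "Name", int page = 1)
          {
              IQueryable<Product> products = AllProducts;

              // 1. Filter first, so paging only counts the matching rows.
              if (!string.IsNullOrEmpty(filter))
                  products = products.Where(p => p.Name.Contains(filter));

              // 2. Sort - Skip/Take needs a defined order.
              products = (sortBy == "Price")
                  ? products.OrderBy(p => p.Price)
                  : products.OrderBy(p => p.Name);

              // 3. Page: skip the earlier pages, take one page of rows.
              List<Product> pageOfProducts = products
                  .Skip((page - 1) * PageSize)
                  .Take(PageSize)
                  .ToList();

              return View(pageOfProducts);
          }
      }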

    Read the article

  • How to implement self-hosted WCF Data Services (http://localhost:1234/myDataService.svc/...)

    - by warmcold
    I have a project that needs to implement WCF Data Services (OData) to retrieve data from a control system (a .NET Framework application). The WCF Data Service needs to be hosted by the .NET application itself (no ASP.NET and no IIS). I have seen many WCF Data Service examples recently, but they are all hosted by an ASP.NET application. I have also seen self-host (console application) examples, but they are for plain WCF services, not WCF Data Services. Here is my question: is it possible to have a standalone .NET application host WCF Data Services (http://localhost:1234/mydataservice.svc/...)? If yes, can someone provide an example? Thanks.
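    A standalone host is in fact possible: System.Data.Services includes a DataServiceHost class (a WebServiceHost subclass) that a plain console application can open itself, with no IIS involved. The following is a minimal sketch only; the Machine entity and MyEntities context are made-up placeholders for whatever the control system actually exposes:

      using System;
      using System.Collections.Generic;
      using System.Data.Services;
      using System.Data.Services.Common;
      using System.Linq;

      // Placeholder entity; the reflection provider requires a key property.
      [DataServiceKey("Id")]
      public class Machine
      {
          public int Id { get; set; }
          public string Name { get; set; }
      }

      // Placeholder context exposing one entity set as IQueryable.
      public class MyEntities
      {
          private static readonly List<Machine> machines = new List<Machine>
          {
              new Machine { Id = 1, Name = "Press 01" }
          };

          public IQueryable<Machine> Machines
          {
              get { return machines.AsQueryable(); }
          }
      }

      public class MyDataService : DataService<MyEntities>
      {
          public static void InitializeService(DataServiceConfiguration config)
          {
              // Read-only access to all entity sets, for illustration only.
              config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
              config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
          }
      }

      class Program
      {
          static void Main()
          {
              var baseAddress = new Uri("http://localhost:1234/myDataService.svc");

              // DataServiceHost self-hosts the OData endpoint - no ASP.NET, no IIS.
              using (var host = new DataServiceHost(typeof(MyDataService),
                                                    new[] { baseAddress }))
              {
                  host.Open();
                  Console.WriteLine("Listening on " + baseAddress);
                  Console.ReadLine();
              }
          }
      }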

    Read the article

  • Writing a Data Access Layer (DAL) for SQL Server

    In this tip, I am going to show you how you can create a Data Access Layer (to store, retrieve, and manage data in a relational database) in ADO.NET. I will show how you can make it data provider independent, so that you don't have to re-write your data access layer if the data storage source changes, and so that you can reuse it in other applications that you develop.
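    As a taste of the provider-independent approach (a minimal sketch, not the tip's actual code; the class shape and query are illustrative), ADO.NET's DbProviderFactories lets the layer create connections, commands, and adapters without naming a concrete provider, so swapping SQL Server for another database becomes a configuration change:

      using System.Data;
      using System.Data.Common;

      public class DataAccessLayer
      {
          private readonly DbProviderFactory factory;
          private readonly string connectionString;

          // providerName e.g. "System.Data.SqlClient"; both values
          // would normally come from the application's config file.
          public DataAccessLayer(string providerName, string connectionString)
          {
              this.factory = DbProviderFactories.GetFactory(providerName);
              this.connectionString = connectionString;
          }

          // Runs a SELECT and returns the rows as a disconnected DataTable.
          public DataTable ExecuteQuery(string sql)
          {
              using (DbConnection connection = factory.CreateConnection())
              using (DbCommand command = connection.CreateCommand())
              using (DbDataAdapter adapter = factory.CreateDataAdapter())
              {
                  connection.ConnectionString = connectionString;
                  command.CommandText = sql;
                  adapter.SelectCommand = command;

                  var table = new DataTable();
                  adapter.Fill(table);   // Fill opens and closes the connection itself
                  return table;
              }
          }
      }

    A call such as new DataAccessLayer("System.Data.SqlClient", "...").ExecuteQuery("SELECT * FROM Customers") then keeps every provider-specific type out of the calling code.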

    Read the article

  • URGENT: Patches Needed to Prevent Data Corruption in Oracle Payments

    - by LuciaC
    Development is seeing a number of datafix bugs being logged related to PPR committing data in Payments (IBY) with missing corresponding payments in Payables. These bugs have been investigated and fixed; however, customers need to proactively apply these fixes to prevent data corruption. There are two root-cause patches available for this case of partial data commit. It is critical that all R12/12.1 Payments customers apply the following two patches ASAP:

    a) Patch 11699958: R12: Error during PPR Leads to Incomplete Data Commit and Inconsistent Status (Doc ID 1338425.1)
    b) Patch 15867522: Confirmed PPR Batches Show Payment Initiated - Data Exist Only in IBY Tables (Doc ID 1506611.1)

    Read the article

  • The Latest In Master Data Management

    Today master data continues to expand while data quality becomes more important. The challenge of clean data is not new, but the stakes and complexities are higher than ever. Fortunately, Oracle has a solution: Oracle Master Data Management. Hear from Pascal Laik, VP of Oracle MDM Product Strategy, about the benefits of Master Data Management, the solutions that Oracle offers, why they are unique, and what benefits customers are deriving from Oracle MDM products. Learn about the latest product in the Oracle MDM family and where the Oracle MDM strategy is heading.

    Read the article

  • How to store and update data tables on the client side (iOS MMO)

    - by farseer2012
    Currently I'm developing an iOS MMO game with cocos2d-x, which depends on many data tables (Excel files) given by the designers. These tables contain data like how much gold/crystal it costs to upgrade a building (barracks, laboratory, etc.). We have about 10 tables, each with about 50 rows of data. My question is how to store those tables on the client side, and how to update them once they have been modified on the server side. My current plan: use SQLite to store the data on the client side; the server parses the Excel files and sends the data to the client in JSON format, then the client parses the JSON string and saves it to the SQLite file. Is there any better method? I find that some games store CSV files on the client side: how do they update the files? Could the server send a whole file directly to the client?

    Read the article

  • Get your picture on the screen at MIX11: Help me create a repository of sample data

    - by Laurent Bugnion
    Here is your chance to get your picture on the big screen during my MIX11 presentation in April this year. I need to create a small repository of sample data for my demos. So instead of tapping into my imagination and creating dummy users (or reusing past information I already used in other demos), I thought I would appeal to the amazing community: send me an email with the following information, and I will include the first 30 users in my sample data repository and use your info in my demo.

    • First Name
    • Last Name
    • Date of birth
    • Picture
    • Link to Facebook profile (optional)

    Disclaimer: The data will only be running locally on my hard drive. The demos will, however, be filmed and the videos made public. By providing this information, you explicitly consent to this data being used in demos at MIX11 and possibly in following conferences. The data will only be used for demo purposes. Thanks for your help!

    Laurent Bugnion (GalaSoft)

    Read the article

  • Using VBA to model data in Autodesk Inventor?

    - by user108478
    I have a close friend who is using a specific device that records the dimensions of an object as it is eroded and outputs the dimensional data to an Excel sheet. The object is spherical in nature but is eroded from the top and bottom, so the shape is constantly changing, and a single formula for surface area and volume would not work. This is where Inventor comes in. My friend can plug the dimensional data into Inventor, and it immediately returns the surface area and volume. The erosion process takes several minutes to complete and records data at very short intervals, so it would be very arduous to plug in the data thousands of times. Since Inventor supports macros and VBA, is there a way to plug the data into Inventor and output it into another spreadsheet? Any suggestions would be appreciated.

    Read the article

  • Silverlight 4 + RIA Services - Ready for Business: Validating Data

    To continue our series, let's look at data validation in our business applications. Updating data is great, but when you enable data updates you often need to check the data to ensure it is valid. RIA Services has a clean, prescriptive pattern for handling this. First let's look at what you get for free: the value for any field entered has to be valid for the range of that data type. For example, you never need to write code to ensure someone didn't type "forty-two" into...
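    The pattern in question centers on System.ComponentModel.DataAnnotations: you decorate entity members with validation attributes (typically through a shared metadata "buddy" class), and RIA Services propagates the rules to the Silverlight client for you. A hedged sketch, with an invented Employee entity:

      using System.ComponentModel.DataAnnotations;

      // Hypothetical entity; the attributes are applied through a nested metadata class.
      [MetadataType(typeof(Employee.EmployeeMetadata))]
      public partial class Employee
      {
          public string Name { get; set; }
          public int Age { get; set; }

          internal sealed class EmployeeMetadata
          {
              [Required(ErrorMessage = "Name is required.")]
              [StringLength(50, ErrorMessage = "Name cannot exceed 50 characters.")]
              public string Name { get; set; }

              [Range(18, 100, ErrorMessage = "Age must be between 18 and 100.")]
              public int Age { get; set; }
          }
      }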

    Read the article

  • How to deal with a ten-day debugging session? [on hold]

    - by smonff
    Ten days ago, I fixed a bug on a large application, and the hot fix made some data disappear from the users' point of view. The data is not deleted, but has been set to a hidden status. It should be possible to get the data back, but it is proving hard: I've already spent 10 days understanding and reproducing the problem (mostly with complex SQL queries, but sometimes it is necessary to update the database to test the application logic). Is 10 days a normal amount of time for this kind of problem? Should we keep on and retrieve the data, or should we tell these users sorry for the loss, but your data has disappeared? Any advice on when to stop searching for how to solve the issue?

    Read the article

  • SQL Server 2008 R2 Reporting Services - The World is But a Stage (T-SQL Tuesday #006)

    - by smisner
    Host Michael Coles (blog|twitter) has selected LOB data as the topic for this month's T-SQL Tuesday, so I'll take this opportunity to post an overview of reporting with spatial data types. As part of my work with SQL Server 2008 R2 Reporting Services, I've been exploring the use of spatial data types in the new map data region. You can create a map using any of the following data sources:

    • Map Gallery - a set of Shapefiles for the United States only that ships with Reporting Services
    • ESRI Shapefile - a .shp file conforming to the Environmental Systems Research Institute, Inc. (ESRI) shapefile spatial data format
    • SQL Server spatial data - a query that includes SQLGeography or SQLGeometry data types

    Rob Farley (blog|twitter) points out today in his T-SQL Tuesday post that using the SQL geography field is a preferable alternative to ESRI shapefiles for storing spatial data in SQL Server. So how do you get spatial data? If you don't already have a GIS application in-house, you can find a variety of sources. Here are a few to get you started:

    • US Census Bureau Website, http://www.census.gov/geo/www/tiger/
    • Global Administrative Areas Spatial Database, http://biogeo.berkeley.edu/gadm/
    • Digital Chart of the World Data Server, http://www.maproom.psu.edu/dcw/

    In a recent post by Pinal Dave (blog|twitter), you can find a link to free shapefiles for download and a tutorial for using Shape2SQL, a free tool to convert shapefiles into SQL Server data. In my post today, I'll show you how to combine spatial data that describes boundaries with spatial data in AdventureWorks2008R2 that identifies store locations, to embed a map in a report.

    Preparing the spatial data

    First, I downloaded Shapefile data for the administrative boundaries in France and unzipped the data to a local folder. Then I used Shape2SQL to upload the data into a SQL Server database called Spatial. I'm not sure why, but I had to uncheck the option to create a spatial index before the upload would work; with the option checked, the upload appeared to run successfully, but no table appeared in my database. The zip file that I downloaded contained three files, but I didn't know what was in them until I used Shape2SQL to upload the data into tables. Then I found that FRA_adm0 contains spatial data for the country of France, FRA_adm1 contains spatial data for each region, and FRA_adm2 contains spatial data for each department (a subdivision of region).

    Next I prepared my SQL query containing sales data for fictional stores selling Adventure Works products in France. The Person.Address table in the AdventureWorks2008R2 database (which you can download from Codeplex) contains a SpatialLocation column, which I joined - along with several other tables - to the Sales.Customer and Sales.Store tables. I'll be able to superimpose this data on a map to see where these stores are located. I included the SQL script for this query (as well as the spatial data for France) in the downloadable project that I created for this post.

    Step 1: Using the Map Wizard to Create a Map of France

    You can build a map without using the wizard, but I find it's rather useful in this case. Whether you use Business Intelligence Development Studio (BIDS) or Report Builder 3.0, the map wizard is the same. I used BIDS so that I could create a project that includes all the files related to this post. To get started, I added an empty report template to the project and named it France Stores. Then I opened the Toolbox window and dragged the Map item to the report body, which starts the wizard. Here are the steps to perform to create a map of France:

    1. On the Choose a source of spatial data page of the wizard, select SQL Server spatial query, and click Next.
    2. On the Choose a dataset with SQL Server spatial data page, select Add a new dataset with SQL Server spatial data.
    3. On the Choose a connection to a SQL Server spatial data source page, select New.
    4. In the Data Source Properties dialog box, on the General page, add a connection string like this (changing your server name if necessary): Data Source=(local);Initial Catalog=Spatial
    5. Click OK and then click Next.
    6. On the Design a query page, add a query for the country shape, like this: select * from fra_adm1
    7. Click Next. The map wizard reads the spatial data and renders it for you on the Choose spatial data and map view options page. You have the option to add a Bing Maps layer, which shows surrounding countries. Depending on the type of Bing Maps layer that you choose to add (Road, Aerial, or Hybrid) and the zoom percentage you select, you can view city names, roads, and various boundaries. To keep from cluttering my map, I'm going to omit the Bing Maps layer in this example, but I do recommend that you experiment with this nice integration feature.
    8. Use the + or - button to resize the map as needed. (I used the + button to increase the size of the map until its edges were just inside the boundaries of the visible map area, which is called the viewport.) You can eliminate the color scale and distance scale boxes that appear in the map area later.
    9. Select Embed map data in this report for faster rendering. The spatial data won't be changing, so there's no need to leave it in the database; however, embedding it does increase the size of the RDL.
    10. Click Next.
    11. On the Choose map visualization page, select Basic Map. We'll add data for visualization later. For now, we have just the outline of France to serve as the foundation layer for our map.
    12. Click Next, and then click Finish.
    13. Now click the color scale box in the lower left corner of the map, and press the Delete key to remove it. Then repeat to remove the distance scale box in the lower right corner of the map.

    Step 2: Add a Map Layer to an Existing Map

    The map data region allows you to add multiple layers. Each layer is associated with a different dataset. Thus far, we have the spatial data that defines the regional boundaries in the first map layer. Now I'll add another layer for the store locations by following these steps:

    1. If the Map Layers window is not visible, click the report body, and then click twice anywhere on the map data region to display it.
    2. Click the New Layer Wizard button in the Map Layers window, and start over again with the process by choosing a spatial data source: select SQL Server spatial query, and click Next.
    3. Select Add a new dataset with SQL Server spatial data, and click Next.
    4. Click New, add a connection string to the AdventureWorks2008R2 database, and click Next.
    5. Add a query with spatial data (like the one I included in the downloadable project), and click Next. The location data now appears as another layer on top of the regional map created earlier.
    6. Use the + button to resize the map again to fill as much of the viewport as possible without cutting off the edges of the map. You might need to drag the map within the viewport to center it properly.
    7. Select Embed map data in this report, and click Next.
    8. On the Choose map visualization page, select Basic Marker Map, and click Next.
    9. On the Choose color theme and data visualization page, in the Marker drop-down list, change the marker to diamond. There's no particular reason for a diamond; I think it stands out a little better than a circle on this map. Clear the Single color map checkbox as another way to distinguish the markers from the map. You can of course create an analytical map instead, which would change the size and/or color of the markers according to criteria that you specify, such as the sales volume of each store, but I'll save that exploration for another post on another day.
    10. Click Finish, and then click Preview to see the rendered report.

    Et voilà... c'est fini. Yes, it's a very simple map at this point, but there are many other things you can do to enhance it. I'll create a series of posts to explore the possibilities.

    Read the article

  • Accessing SharePoint 2010 Data with REST/OData on Windows Phone 7

    - by Jan Tielens
    Consuming SharePoint 2010 data in Windows Phone 7 applications using the CTP version of the developer tools is quite a challenge. The issue is that SharePoint 2010 data is not anonymously available; users need to authenticate to be able to access the data. When I first tried to access SharePoint 2010 data from my first Hello-World-type Windows Phone 7 application, I thought "Hey, this should be easy!" because Windows Phone 7 development is based on Silverlight, and SharePoint 2010 has a Client Object Model for Silverlight. Unfortunately you can't use the Client Object Model of SharePoint 2010 on the Windows Phone platform; there's a reference to an assembly that's not available (System.Windows.Browser). My second thought was "OK, no problem!" because SharePoint 2010 also exposes a REST/OData API to access SharePoint data. Using the REST API in SharePoint 2010 is as easy as making a web request for a URL (in which you specify the data you'd like to retrieve), e.g. http://yoursiteurl/_vti_bin/listdata.svc/Announcements. This is very easy to accomplish in a Silverlight application that's running in the context of a page in a SharePoint site, because the credentials of the currently logged on user are automatically picked up and passed to the WCF service. But a Windows Phone application is of course running outside of the SharePoint site's pages, so the application has to build credentials that are passed to SharePoint's WCF service. This turns out to be a small challenge in Silverlight 3: the WebClient doesn't support authentication; there is a Credentials property, but when you set it and make the request you get a NotImplementedException. Probably this issue will be solved in the very near future, since Silverlight 4 does support authentication, and there's already a WCF Data Services download that uses this new platform feature of Silverlight 4. So when the Windows Phone platform switches to Silverlight 4, you can just use the WebClient to get the data. Even more, if the OData Client Library for Windows Phone 7 gets updated after that, things should get even easier! By the way: the things I'm writing in this paragraph are just assumptions that make a lot of sense IMHO; I don't have any info that all of this will happen, but I really hope so. So are SharePoint developers out of the Windows Phone development game until they get this fixed? Luckily not: when the HttpWebRequest class is used instead, you can pass credentials! Using the HttpWebRequest class is slightly more complex than using the WebClient class, but the end result is that you have access to your precious SharePoint 2010 data.
    The following code snippet gets all the announcements of an Announcements list in a SharePoint site:

      HttpWebRequest webReq = (HttpWebRequest)HttpWebRequest.Create(
          "http://yoursite/_vti_bin/listdata.svc/Announcements");
      webReq.Credentials = new NetworkCredential("username", "password");

      webReq.BeginGetResponse(
          (result) =>
          {
              HttpWebRequest asyncReq = (HttpWebRequest)result.AsyncState;

              XDocument xdoc = XDocument.Load(
                  ((HttpWebResponse)asyncReq.EndGetResponse(result)).GetResponseStream());

              XNamespace ns = "http://www.w3.org/2005/Atom";
              var items = from item in xdoc.Root.Elements(ns + "entry")
                          select new { Title = item.Element(ns + "title").Value };

              this.Dispatcher.BeginInvoke(() =>
              {
                  foreach (var item in items)
                      MessageBox.Show(item.Title);
              });
          }, webReq);

    When you try this in a Windows Phone 7 application, make sure you add a reference to the System.Xml.Linq assembly, because the code uses LINQ to XML to parse the resulting Atom feed; the Title of every announcement is displayed in a MessageBox. Check out my previous post if you'd like to see a more polished sample Windows Phone 7 application that displays SharePoint 2010 data. When you plan to use this technique, it's of course a good idea to encapsulate the code doing the request, so it becomes really easy to get the data that you need. In the following code snippet you can find the GetAtomFeed method, which gets the contents of any Atom feed, even if you need to authenticate to get access to the feed.

      delegate void GetAtomFeedCallback(Stream responseStream);

      public MainPage()
      {
          InitializeComponent();

          SupportedOrientations = SupportedPageOrientation.Portrait |
              SupportedPageOrientation.Landscape;

          string url = "http://yoursite/_vti_bin/listdata.svc/Announcements";
          string username = "username";
          string password = "password";
          string domain = "";

          GetAtomFeed(url, username, password, domain, (s) =>
          {
              XNamespace ns = "http://www.w3.org/2005/Atom";
              XDocument xdoc = XDocument.Load(s);

              var items = from item in xdoc.Root.Elements(ns + "entry")
                          select new { Title = item.Element(ns + "title").Value };

              this.Dispatcher.BeginInvoke(() =>
              {
                  foreach (var item in items)
                  {
                      MessageBox.Show(item.Title);
                  }
              });
          });
      }

      private static void GetAtomFeed(string url, string username,
          string password, string domain, GetAtomFeedCallback cb)
      {
          HttpWebRequest webReq = (HttpWebRequest)HttpWebRequest.Create(url);
          webReq.Credentials = new NetworkCredential(username, password, domain);

          webReq.BeginGetResponse(
              (result) =>
              {
                  HttpWebRequest asyncReq = (HttpWebRequest)result.AsyncState;
                  HttpWebResponse resp = (HttpWebResponse)asyncReq.EndGetResponse(result);
                  cb(resp.GetResponseStream());
              }, webReq);
      }

    Read the article

  • Extract data from a specific range of cells in multiple worksheets in multiple files.

    - by Michele
    Extract data from a specific range of cells (always the same cells) in multiple worksheets in multiple files, where 1 file = 1 day. I have 6 technicians each day of the week, Monday thru Friday. So, 5 files with 6 worksheets each. I have entered specific info in specific cells of every worksheet. The range is constant (the same address in EVERY worksheet in every file). So, I need a formula to extract and calculate the data in the given range and dump it into another spreadsheet. I can forward an example file if it will help anyone answer my question, and more explanation is available upon request. JUST PLEASE SOMEBODY HELP ME!!!!! Thank you all in advance. Regards, Michele

    Read the article

  • How can I recover [data from] my failing USB key?

    - by moe37x3
    I have a Corsair Flash Voyager USB key, and it's almost completely failed. When I plug it into my [WinXP] computer, the OS mounts it and opens up Explorer at the drive's root directory. However, if I try to copy any data off, I get an error message saying that the device is not there. If I leave it plugged in, the OS seems to oscillate between seeing it and not seeing it, since the "Safely Remove Hardware" tray icon appears and disappears every few seconds. The damage was probably caused by my abuse, either from plugging it in with my keys hanging off of it or from losing the cap and keeping it in my pocket uncapped. Is there anything I can do to save the data from it or even rehabilitate the drive?

    Read the article

  • Changing Word mail merge data source locations in bulk?

    - by Daft Viking
    I've just moved a number of Word mail merge files, and a number of Excel spreadsheets that are the data sources for the mail merges, from a Windows XP computer to a Windows 7 computer, and now all the paths for the merge sources are incorrect (used to be c:\documents and settings\user\my documents.... now c:\users\documents....). While I can correct the path of the data source in each file individually, I was hoping that there would be some way of updating the files in bulk, as there are a relatively large number of them. Word 2007 is what is being used, but the documents are all in the previous DOC format (not DOCX).
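    One possible bulk approach (a sketch under assumptions, not a tested recipe; the folder and workbook paths are placeholders) is a small C# console program that drives Word through the interop assembly, opening each merge document and re-attaching its data source at the new location:

      using System.IO;
      using Word = Microsoft.Office.Interop.Word;

      class FixMergeSources
      {
          static void Main()
          {
              // Placeholder paths - point these at the real folders.
              string docFolder = @"C:\Users\user\Documents\Merges";
              string newSource = @"C:\Users\user\Documents\Data\MergeData.xls";

              var word = new Word.Application { Visible = false };
              try
              {
                  foreach (string path in Directory.GetFiles(docFolder, "*.doc"))
                  {
                      Word.Document doc = word.Documents.Open(path);
                      if (doc.MailMerge.MainDocumentType !=
                          Word.WdMailMergeMainDocType.wdNotAMergeDocument)
                      {
                          // Re-attach the data source at its new location.
                          doc.MailMerge.OpenDataSource(Name: newSource);
                          doc.Save();
                      }
                      doc.Close();
                  }
              }
              finally
              {
                  word.Quit();
              }
          }
      }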

    Read the article

  • Does replacing Chrome's User Data folder with my own work without leaving any trace behind? Where else does Chrome write data outside of the User Data folder?

    - by Selin Peck
    I used to start office work by removing Chrome's User Data folder and replacing it with my own User Data copied from my external drive, saving the original User Data to another folder. Before leaving in the evening, I take back my own User Data and put the original User Data back where it was originally saved. Is this process advisable? Would I be safe this way, and if not, where else does Chrome save data outside of the User Data folder in AppData? Also, how does this work in Mozilla Firefox?

    Read the article

  • How do I populate multiple records of data into a PDF form like a mail-merge?

    - by user38801
    I have Acrobat Pro, and I have a PDF with a form on it. Assuming the fields in the form correspond to a data source (like rows in an RDBMS table or an XML file), I want to print multiple copies of the PDF file, with each copy having the values of a different row in the data source. It is preferable to directly interface with an actual database, rather than having to save an XML file every time I do this. If this involves programming that's cool too; I only posted here because the question didn't seem appropriate for StackOverflow. Thanks!
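    If programming is indeed acceptable, one route (an illustrative sketch using the open-source iTextSharp library rather than Acrobat itself; the field names and rows are invented) is to stamp each data row into its own copy of the form:

      using System.Collections.Generic;
      using System.IO;
      using iTextSharp.text.pdf;

      class PdfFormFiller
      {
          static void Main()
          {
              // Invented rows - in practice, read these from the database.
              var rows = new List<Dictionary<string, string>>
              {
                  new Dictionary<string, string> { { "Name", "Ann Example" }, { "City", "Oslo" } },
                  new Dictionary<string, string> { { "Name", "Bob Example" }, { "City", "Bern" } }
              };

              int copy = 0;
              foreach (var row in rows)
              {
                  PdfReader reader = new PdfReader("form.pdf");
                  using (var output = new FileStream("filled_" + copy++ + ".pdf", FileMode.Create))
                  {
                      var stamper = new PdfStamper(reader, output);

                      // Each key must match a form field name in the PDF.
                      foreach (var field in row)
                          stamper.AcroFields.SetField(field.Key, field.Value);

                      stamper.FormFlattening = true; // bake the values in for printing
                      stamper.Close();
                  }
                  reader.Close();
              }
          }
      }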

    Read the article
