Search Results

Search found 47251 results on 1891 pages for 'web storage'.


  • Win 7 Explorer backup and long paths

    - by user53299
    I use Explorer to do backups because Win 7's backup program asks me to take previously made backups and put them back in the drive. I am opposed to that idea, since I believe backups should remain in storage. With Explorer backups (Burn to disc) I have encountered the "destination path too long" error message, and it shows the name of a folder "Debug" three times. I have hundreds of folders named "Debug", thanks to Visual Studio. At this moment I'm too angry at Microsoft to write a program to determine my 3 longest paths. (Aside: this is all after coincidentally reading two articles about path junctions earlier this evening, which already made me kind of unhappy.) Please, is there an easy way to continue making backups with Explorer? Edit: I should add that renaming paths wrecks Visual Studio projects, so I really need to isolate the small number of problem paths or find a cleaner solution.
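    A minimal C# sketch of the program described - the scan root is a placeholder, and note that on older .NET Framework runtimes the enumeration itself can throw PathTooLongException for paths past the ~260-character limit - might look like this:

        using System;
        using System.IO;
        using System.Linq;

        class LongestPaths
        {
            static void Main(string[] args)
            {
                // Root of the tree being burned; placeholder - adjust as needed.
                string root = args.Length > 0 ? args[0] : @"C:\Projects";

                // Rank every file path by length and keep the three longest.
                var longest = Directory
                    .EnumerateFiles(root, "*", SearchOption.AllDirectories)
                    .OrderByDescending(p => p.Length)
                    .Take(3);

                foreach (string path in longest)
                    Console.WriteLine(path.Length + "  " + path);
            }
        }

    Since the offenders are Visual Studio "Debug" folders, excluding bin/obj build output from the backup entirely may be the cleaner fix, as that output is regenerated on the next build.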

    Read the article

  • Ways to increase my Ubuntu partition space

    - by Andreas Grech
    I am currently running Ubuntu and Windows 7 as dual-boot on a single HD. The problem is that when I installed Ubuntu, I didn't allocate as much space as I thought I would need, and now I need to 'reinstall' Ubuntu so that I can increase the amount of storage space. Now there are two ways to go about this. Either I use GParted to increase my partition space (but I read that it's not really that safe as regards data loss), or I create a new, larger partition and reinstall Ubuntu there. But if I want to reinstall Ubuntu, is there a way I can somehow "save" my current Ubuntu and install that one? What I mean is that I don't want to lose my currently installed packages and the files that I have on this partition. Is there a way to kind of 'streamline' my current Ubuntu so that I can install this one on the new partition? If not, what are your opinions as regards GParted?

    Read the article

  • Where can I find different versions of Lucene.Net Analyzer

    - by Vinay Pandey
    Hi All, I know it's a silly question, but I am struggling to support Japanese (and other such languages) in search for my web application using Lucene.Net. I know that different analyzers can be used for different languages and can be implemented, but I could not find any DLL for the analyzers, or an example of using them. The questions are: Will using different analyzers be a good option for a web application, since the search text can be in any form? And where can I find the DLL and a sample application for implementing search across different sets of languages? I have spent a whole day on this, but no luck :(.
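    As a starting point, a rough sketch of a per-language setup - assuming Lucene.Net 3.0.3 with the contrib analyzers package, which ships CJKAnalyzer for Japanese-style bigram tokenization; the field name here is invented:

        using Lucene.Net.Analysis;
        using Lucene.Net.Analysis.CJK;
        using Lucene.Net.Analysis.Standard;
        using Lucene.Net.Documents;
        using Lucene.Net.Index;
        using Lucene.Net.Store;
        using Version = Lucene.Net.Util.Version;

        class MultiLanguageIndexing
        {
            static void Main()
            {
                // Route each field to a language-appropriate analyzer,
                // falling back to StandardAnalyzer for everything else.
                var analyzer = new PerFieldAnalyzerWrapper(
                    new StandardAnalyzer(Version.LUCENE_30));
                analyzer.AddAnalyzer("title_ja", new CJKAnalyzer(Version.LUCENE_30));

                using (var dir = FSDirectory.Open(new System.IO.DirectoryInfo("index")))
                using (var writer = new IndexWriter(dir, analyzer,
                           IndexWriter.MaxFieldLength.UNLIMITED))
                {
                    var doc = new Document();
                    doc.Add(new Field("title_ja", "日本語のタイトル",
                        Field.Store.YES, Field.Index.ANALYZED));
                    writer.AddDocument(doc);
                }
            }
        }

    The same analyzer choice has to be applied again at query time; when the language of incoming search text is unknown, indexing one field per language (or running language detection first) is the usual workaround.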

    Read the article

  • IIS Virtual Directory/Application & Forms authentication

    - by user216194
    I've set up and deployed a simple forms authentication website with membership using .NET 4. I've created a virtual directory (now converted to an "Application") in IIS7 and set up the web.config file in the virtual directory as follows:

        <system.webServer>
            <directoryBrowse enabled="true" />
        </system.webServer>

    Great! I browse to the virtual directory, ../mydomain/books/, and I'm automatically redirected to the login page specified by web.config in my root directory, with the URL path as follows: ../Account/Login.aspx?ReturnUrl=%2fbooks At this point I log in successfully, but I am not redirected anywhere, and when I manually return to the directory, ../books, I'm sent back to the login page, where I'm already logged in? So I'm confused about what my problem is! I should be successfully authenticated and then redirected back to the directory, or at the very least be able to view it manually after I log in, right?
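    One thing worth checking is whether the login code actually honors the ReturnUrl value. A minimal sketch of a login handler that does - the control names (UserName, Password, LoginButton) are hypothetical - would be:

        using System;
        using System.Web.Security;

        public partial class Login : System.Web.UI.Page
        {
            protected void LoginButton_Click(object sender, EventArgs e)
            {
                // Validate against the configured membership provider.
                if (Membership.ValidateUser(UserName.Text, Password.Text))
                {
                    // Issues the forms-auth cookie AND redirects to the
                    // ReturnUrl query value (or defaultUrl from web.config).
                    FormsAuthentication.RedirectFromLoginPage(UserName.Text, false);
                }
            }
        }

    If the cookie is being issued but ../books still bounces to the login page, a mismatch between the forms-auth cookie settings in the application's web.config and the root site's is another common culprit.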

    Read the article

  • Optimal Configuration for five 300 GB 15K SAS Drives

    - by Bob
    I recently acquired an HP Z800 workstation that has five 300 GB 15K SAS Drives. This system will be dedicated to running multiple virtual machines under VMware Workstation (Note: I'm not using ESXi because I do plan to use the system for other purposes.). For the host OS, I plan to install RHEL 5. My number one concern is guest performance. For example, should I create a RAID 10 array for the OS and virtual machine storage with four of the drives and reserve the 5th? Or, is there a solution that will provide better performance?

    Read the article

  • How can I improve performance over SMB/CIFS for an application that has poor write speeds?

    - by Jeremy
    I have a third party application that reads several large files and generates a third large file. Its performance is quite good when the generated file is stored on "local storage", i.e. either a direct attached or iSCSI-based disk. The source files that are read can be stored remotely on our NAS and accessed via SMB with little effect on performance. However, if we attempt to write the target file to any kind of SMB/CIFS share (Samba or Windows Server) the performance drops almost ten-fold. This is unacceptably slow in our case. Writing files to network shares is not otherwise slow. I can copy large files to SMB shares and get great performance - near what I would expect is possible given the disks and network in question. I have a theory that this application's problem with SMB shares has something to do with a lack of write caching over the share and perhaps lots of network roundtrips. Is this possible and is there anything that can be done about it?
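    One way to test the roundtrip theory before touching server settings is to time the same volume of data written to the share in small chunks versus large ones. A quick C# sketch, with the UNC path as a placeholder:

        using System;
        using System.Diagnostics;
        using System.IO;

        class SmbWriteTest
        {
            // Placeholder share path; point it at the slow target.
            const string Target = @"\\nas\share\test.bin";
            const long TotalBytes = 256L * 1024 * 1024;

            static void Main()
            {
                TimeWrites("4 KB writes", 4 * 1024);
                TimeWrites("1 MB writes", 1024 * 1024);
            }

            static void TimeWrites(string label, int chunkSize)
            {
                var chunk = new byte[chunkSize];
                var sw = Stopwatch.StartNew();
                using (var fs = new FileStream(Target, FileMode.Create,
                           FileAccess.Write, FileShare.None, chunkSize))
                {
                    // Same total bytes each run; only the write granularity changes.
                    for (long written = 0; written < TotalBytes; written += chunkSize)
                        fs.Write(chunk, 0, chunkSize);
                }
                Console.WriteLine("{0}: {1:F1}s", label, sw.Elapsed.TotalSeconds);
            }
        }

    If the small-write case is dramatically slower, the application's write pattern - rather than the share itself - is the likely problem, and workarounds such as writing locally and copying afterwards, or exposing the target over iSCSI instead of CIFS, become worth exploring.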

    Read the article

  • asp.net Multiple Page_Load events for a user control when using URL Routing

    - by Paul Hutson
    Hello, I've recently set up an ASP.NET site (not using MVC.net) to use URL routing (more on the code below). When using user controls on the site (i.e. I've created a "menu" user control to hold menu information), the Page_Load event for that control will fire twice when the URL has more than one variable passed over:

    pageName/VAR1 : will only fire the Page_Load event once.
    pageName/VAR1/VAR2 : will fire the Page_Load event twice. (Multiple extra VARs added on the end will still only fire the Page_Load event twice.)

    Below are the code snippets from the files. The first is the MapPageRoute, located in the Global.asax:

        // Register a route for the Example page, with the NodeID and also the Test123
        // variables allowed. This demonstrates how to have several items linked
        // with the page routes.
        routes.MapPageRoute(
            "Multiple Data Example",                 // Route name
            "Example/{NodeID}/{test123}/{variable}", // Route URL - note the NodeID bit
            "~/Example.aspx",                        // Web page to handle route
            true,                                    // Check for physical access
            new System.Web.Routing.RouteValueDictionary
            {
                { "NodeID", "1" },      // Default Node ID
                { "test123", "1" },     // Default additional variable value
                { "variable", "hello" } // Default test variable value
            });

    Next is the way I've directed to the page in the menu item; this is a list item within a UL tag:

        <li class="TopMenu_ListItem"><a href="<%= Page.GetRouteUrl("Multiple Data Example",
            new System.Web.Routing.RouteValueDictionary {
                { "NodeID", "4855" }, { "test123", "2" } }) %>">Example 2</a></li>

    And finally the control that gets hit multiple times on a page load:

        // For use when the page loads. Each route value is handled only if the
        // page was reached using URL routing.
        protected void Page_Load(object sender, EventArgs e)
        {
            if (Page.RouteData.Values["NodeID"] != null)
            {
                nodeID = Page.RouteData.Values["NodeID"] as string;
            }

            if (Page.RouteData.Values["Test123"] != null)
            {
                ExampleOutput2.Text = "I am the output of the third variable : "
                    + Page.RouteData.Values["Test123"] as string;
            }

            if (Page.RouteData.Values["variable"] != null)
            {
                ExampleOutput3.Text = "I say "
                    + Page.RouteData.Values["variable"] as string;
            }
        }

    Note that when I'm just hitting the page and it uses the default values for items, the reloads do not happen. Any help or guidance that anyone can offer would be very much appreciated!

    EDIT: The user control is only added to the page once. I've tested the load sequence by putting a breakpoint in the Page_Load event - it only hits twice when the extra routes are added.

    Thanks in advance, Paul Hutson

    Read the article

  • Save BLOB to disk as Image C#

    - by TGuimond
    Hello, I am developing a web application that calls web services developed by a third party to send/receive data from a client's database. I am building the application using ASP.NET 3.5 and C#. They provide me images in BLOB format: I call their method and what I get back is a DataSet with two fields, "logo_name" and "logo", the latter being the BLOB field. What I want to do is save the image locally on disk (to minimize calls to their database in the future). I have been playing around with a couple of different ways of doing this but cannot seem to get the image to save correctly. I have been able to create a file in a folder with the correct name, but if I try to view the image it does not work. I was hoping someone could give me some sample code showing how to save a BLOB field to the local disk? Thanks
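    A minimal sketch of the usual approach - assuming the "logo" column arrives as a byte[] and "logo_name" includes the file extension (assumptions, since the third-party schema isn't shown):

        using System.Data;
        using System.IO;

        static class LogoExporter
        {
            public static void SaveLogos(DataSet ds, string folder)
            {
                foreach (DataRow row in ds.Tables[0].Rows)
                {
                    string name = (string)row["logo_name"];
                    byte[] blob = (byte[])row["logo"];

                    // Write the raw bytes straight to disk. No Image/Bitmap
                    // round-trip is needed, and no text-based writer should be
                    // used - re-encoding binary data as text is the classic way
                    // to get a file with the right name that won't open.
                    File.WriteAllBytes(Path.Combine(folder, name), blob);
                }
            }
        }

    A file created with the correct name that won't open usually means the bytes were transformed on the way - for example, written through a text-based StreamWriter, or Base64 text saved without first decoding it via Convert.FromBase64String.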

    Read the article

  • What ASP.NET MVC project files should I not add to Subversion

    - by Dan
    This is likely a naive question, but I want to do this right the first time. I have an MVC solution which has the following:

    Data project - C#
    Services project - C#
    MVC Web project - ASP.NET MVC
    Test project

    Currently, I am using the MVC2 source as a means to debug my own code. I do not plan on checking that in, but I realize that once I go back to the MVC2 DLL, my solution will change. I'm pretty sure I just shouldn't check in stuff that changes with each build: the bin folder of the Web project, for example. Is there a list of what not to commit to source control? :)

    Read the article

  • Reliable 1tb or larger hard drive?

    - by jasondavis
    I am in the market for 2-3 new drives; I would like each to be at least 1 TB to 2 TB in size. I have been reading all the reviews on newegg.com for 1 TB and larger drives, and they all have one thing in common: almost all the ones I read about have complaints of drives being DOA or dying within a few weeks of use. I am hoping to find some drives in this storage range that have a reputation for lasting a long time instead of a short life. Please help me if you have any experience with this sort of drive. Most of the ones I read about were Western Digital. I realize some might complain that this question's answer is time-sensitive, so if a user finds this answer a year from now it will be outdated, but I would appreciate any help based on the hard drives available as of April 10th, 2010 on newegg.com.

    Read the article

  • Transfer many Gigabytes between two servers

    - by Bernhard
    Hello, I have a big problem. I have to move data from an old webspace which is only accessible by FTP. The new root server is accessible by SSH, of course :-) I need to move all the data from the old space, but the amount is just huge. Is there a way to move all the files directly from the old FTP to the new storage, and not over a third station (my local machine)? I've tried it with ftp but it didn't work; I think I used the wrong commands. Is there a way to do this? Thank you in advance, Bernhard

    Read the article

  • PowerShell create new Azure VM from uploaded disk (not image)

    - by MikeBaz
    I have a VHD in Azure storage. That VHD is configured as an OS disk through a command like the following:

        Add-AzureDisk -DiskName $newCode `
            -MediaLocation "http://$script:accountName.blob.core.windows.net/$newCode/$sourceVhdName.vhd" `
            -Label $newCode -OS "Windows"

    I would like to create a new VM pointing at that disk. From what I can tell, if I was doing this with an image I would do something like:

        New-AzureVMConfig -Name $newCode -InstanceSize $instanceSize `
            -MediaLocation "http://$script:accountName.blob.core.windows.net/$newCode/$sourceVhdName.vhd" `
            -ImageName $newCode `
            | Add-AzureProvisioningConfig -Windows -Password $adminPassword `
            | New-AzureVM -ServiceName $newCode

    However, this is wrong for me because I don't have an image - I have a configured VHD that is not sysprepped and can't be. How can I create the VM in PowerShell to point at the existing disk, like I can through the portal?

    Read the article

  • Updating Silverlight with data. JSON or WCF?

    - by Alastair Pitts
    We will be using custom Silverlight 4.0 controls on our ASP.NET MVC web page to display data from our database, and I was wondering what the most efficient method is. We will have returned values of up to 100k records (of 2 properties per record). We have a test that uses the HTML Bridge from JavaScript to Silverlight: first we perform a POST request to a controller action in the MVC web app and return JSON; this JSON is then passed to the Silverlight control, where it is parsed and the UI updated. This seems rather slow - the stored procedure (the select) takes about 3 seconds, and the entire update in the browser takes about 10-15 seconds. Having a brief look on the net, it seems that WCF is another option, but not having used it, I wasn't sure of its capability or suitability. Does anyone have any experiences or recommendations?
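    For comparison, the WCF side can be as small as a data contract plus a service contract; a hedged sketch, with all type and member names invented for illustration:

        using System.Collections.Generic;
        using System.Runtime.Serialization;
        using System.ServiceModel;

        [DataContract]
        public class RecordDto
        {
            [DataMember] public double X { get; set; }
            [DataMember] public double Y { get; set; }
        }

        [ServiceContract]
        public interface IRecordService
        {
            // Silverlight generates an async proxy for this via "Add Service Reference".
            [OperationContract]
            List<RecordDto> GetRecords(int queryId);
        }

    Calling the service directly from Silverlight also avoids marshaling 100k records through the JavaScript HTML bridge, which may well be where much of the 10-15 seconds is going.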

    Read the article

  • User reactions to WYSIWYM

    - by David
    I am trying to decide between a WYSIWYG editor (e.g. TinyMCE, CKEditor) and a WYSIWYM (What You See Is What You Mean) editor (e.g. WMD) for my web application. There is a thread on stackoverflow that compares the two approaches. I would like to know how users, particularly computer novices, have reacted to WYSIWYM editors in deployed web applications. It could be that computer novices are confused by WYSIWYM editors, preferring the immediacy of WYSIWYG; but is that borne out in real-world applications? It's not theory I'm asking about here, but empirical evidence of the acceptance or otherwise of WYSIWYM.

    Read the article

  • Using Folder Redirection GPO and Offline Files and Folders

    - by user132844
    I want to use Folder Redirection to redirect users' My Documents to a network share. First question: what is the best practice for mapping the drive? Should I use the Profile tab in AD with the %username% variable, a net use logon script, or something else? Second question: how do I deal with laptops and syncing the network share with local storage? I want two-way syncing, so that if a user manually maps their networked home drive and edits it from a different computer, the newer version syncs to their My Documents folder the next time they connect from their normal work computer; and I also want to be sure that if they edit a file offline on their laptop while away from the office, the network version picks up the changes the next time that laptop connects. Please advise best practices for this scenario in a 2008 R2/Win7 environment. I am also interested in Mac clients for this environment - while I am very Mac savvy, I would like to hear what others consider best practices for Mac network home directories in a Windows environment.

    Read the article

  • How to embed a font in website

    - by Mayur
    Hi All, I am trying to embed a custom font in my web site. I generated a kit with http://www.fontsquirrel.com/fontface/generator - after uploading a font, that site gives me this CSS:

        @font-face {
            font-family: 'VoltaEFTU-Regular';
            src: url('voltaeftu-regular-webfont.eot');
            src: local('?'),
                 url('voltaeftu-regular-webfont.woff') format('woff'),
                 url('voltaeftu-regular-webfont.ttf') format('truetype'),
                 url('voltaeftu-regular-webfont.svg#webfonttKmU3jX8') format('svg');
            font-weight: normal;
            font-style: normal;
        }

    But it's not working on my machines. How can I embed a font in my web site? Thanks, Mayur Mate

    Read the article

  • Server location moved and how can I move the files

    - by Bernhard
    Hello together, I have a big problem. I have to move data from an old webspace which is only accessible by FTP. Now we have a new root server which is accessible by SSH, of course :-) Now I need to move all data from the old space, but there are a lot of GB of files. Is there a way to fetch all files directly from the old FTP to the new storage, and not over a third station (my local machine)? I've tried it with ftp but without success; I think I used the wrong commands. Is there a way to establish something like this, including all files and directories? Thank you in advance, Bernhard

    Read the article

  • Blackberry Widget Packager saves my files in strange places

    - by chibineku
    I have just installed the BlackBerry Widget Packager and BlackBerry Web Plug-in for Eclipse, and everything works fine, but my files are output to strange places. For example, I tried putting my zipped source files in the folder Blackberry Widget Packager/web and I got an error during packaging. Packaging works when the .zip is in the same directory as bbwp, though. When a widget is successfully created, the executable .jar file and the .rapc files are put in some stupid folder like users/user/temp/widgetname094098456, and the other files are split between two folders in Blackberry Widget Packager/bin. This is slightly annoying, as I don't want to be spending time herding my files. Anyone have any thoughts on why my files are being scattered like this?

    Read the article

  • Installing CXF with Eclipse 3.5

    - by shinynewbike
    As per http://help.eclipse.org/helios/topic/org.eclipse.jst.ws.cxf.doc.user/reference/preferences.html, the CXF 2.x preferences can be accessed via Window > Preferences... > Web Services > CXF 2.x Preferences from the top-level menu. But I don't see the option CXF 2.x Preferences under Web Services, though I have chosen the Java EE perspective. Any ideas how to enable this? Sorry for such a simple question. I also cannot see CXF as a Project Facet, as per http://help.eclipse.org/helios/index.jsp?topic=/org.eclipse.jst.ws.jaxws.doc.user/gettingstarted/requirements.html. I know this is something to do with making Eclipse aware of the CXF libraries, but I can't find the tutorial for this.

    Read the article

  • Command-line access for Apple Time Machine?

    - by Stefan Lasiewski
    We use Apple's Time Machine to back up our workstations at the office. If I want to restore a file, I need to open up the Time Machine GUI and browse files there. The GUI is ugly eye-candy and gets in my way. Is there a way to browse the Time Machine archive using the Mac's command line? I'm used to Netapps and other storage appliances. I use backintime for my Ubuntu workstation. To restore a file with one of those systems, you can restore a file with a simple command like:

        cp .snapshot/daily.0/filename.txt .

    or

        cp /backup/backintime/20100611-000002/backup/etc/shadow /etc/shadow

    Is there an equivalent for Apple's Time Machine?

    Read the article

  • How to Disable secondary drive from booting upon restart - Windows

    - by DevCompany
    I had a Windows 2003 hard drive in my server, and it went bad, so I installed a new clean hard drive and installed Windows 2008 R2 on it. I moved the old 2003 drive to be used only for general storage on the same computer. The system usually boots into Windows 2008 upon a restart, but sometimes it starts trying to boot the old 2003 drive and causes boot issues (NTLDR bootloader errors, among others), even though the boot order is set to boot 2008, NOT 2003. I need to know how to remove any old boot code that keeps this old drive bootable. I still want to use it as a secondary drive, just without any boot code on it. Hopefully my situation is clear enough for everyone to give a good response. Thank you...

    Read the article

  • RHEL raw device (over VMware RDM) performance issues

    - by jifa
    I'm running RHEL 5.3 over vSphere 4.0U1. I configured multiple LUNs on my NetApp (Fibre Channel) storage and added the RDMs to two (Linux) VMs, using the paravirtual SCSI adapter. One LUN is 100 GB in size, successfully mapped to /dev/sdb on both VMs; 5 more are 500 MB in size (mapped to /dev/sd{c-g}). I also created one partition per device. I have encountered two issues. First, writing directly to /dev/sdb1 gives me ~50 MB/s, while any of /dev/sd{c-g}1 gives me ~9 MB/s. There is no difference in the configuration of the LUNs apart from their size. I am wondering what causes this, but it is not my main problem, as I would settle for 9 MB/s. I then created raw devices using udev, pretty straightforwardly, with one rule per device:

        ACTION=="add", KERNEL=="sdb1", RUN+="/bin/raw /dev/raw/raw1 %N"

    Writing to any of the new raw devices dramatically slows performance down to just over 900 KB/s. Can anyone point me in a helpful direction? Thanks in advance, -- jifa

    Read the article

  • Help me find some good 'Reflection' reading materials??

    - by IbrarMumtaz
    If it wasn't for you guys I would've never discovered Albahari.com's free threading ebook. My apologies if I come off as extremely lazy, but I need to make efficient use of my time and make sure I'm not just swallowing any garbage off the web. I am therefore looking for the most informative reading material, with a fair bit of detail, on the Reflection chapter in .NET. Reflection is something that comes up time and time again, and I want to extend my reading beyond what I already know from the official 70-536 book. I'm not a big fan of MSDN, but at the moment that's all I'm using. Has anyone got any other good published reading material off the web that can help revision for the entrance exam? Would be greatly appreciated! Thanks in advance, Ibrar
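    For anyone revising the same topic, the 70-536 material centers on patterns like the following minimal sketch - inspecting a type at runtime and invoking a method dynamically by name:

        using System;
        using System.Reflection;

        class ReflectionDemo
        {
            static void Main()
            {
                // Inspect a type at runtime: list its public static methods...
                Type t = typeof(DateTime);
                foreach (MethodInfo m in t.GetMethods(BindingFlags.Public | BindingFlags.Static))
                    Console.WriteLine(m.Name);

                // ...and invoke one dynamically by name (null target = static method).
                MethodInfo parse = t.GetMethod("Parse", new[] { typeof(string) });
                object result = parse.Invoke(null, new object[] { "2010-04-10" });
                Console.WriteLine(result);
            }
        }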

    Read the article

  • Network adapters reliability

    - by casey_miller
    Can you help me understand the reliability of network adapters? Most of the time, servers have at least 2 NICs bonded to provide a sort of HA, so in case one NIC fails, the second will still do the job. I wonder which factors matter when you use network adapters. I know that the most important and weakest part of any computer system is storage (i.e. HDDs), but how reliable are network adapters, actually? There are more expensive ones and cheaper adapters. In which circumstances do they actually fail - intensive usage, or simply time powered on? In your experience, how often have you found yourself changing NICs due to failure? Or just: what's the typical lifetime of commodity NICs? Thanks.

    Read the article
