Search Results

Search found 42367 results on 1695 pages for 'resource files'.


  • PNG files are not being displayed in IE and Vista's Sidebar

    - by Wbdvlpr
    Hi, I am using Windows Vista. I was just visiting a website and noticed "missing image" boxes on its pages, and I could see the same on many of the websites I visited. Then I realised that only a certain type of image file is not being displayed: PNG. I restarted my computer and noticed that two of the Sidebar gadgets were missing background images. The websites with "missing" images work fine in Firefox, though, so it's a problem related to IE and some Windows files. Any ideas how I can get PNGs working in IE, the Sidebar, etc.? Thanks.

    Read the article

  • How can I use Apache log files to recreate a usage scenario

    - by daigorocub
    Recently I installed a website that received too many requests and was too slow. Many improvements have been made to the website code and we've also bought a new server. I want to test the new server with exactly the same requests that made the old server slow. After that, I will double the requests, run new tests and so on. These requests are logged in the Apache log files, so I can parse those files and write some kind of script to make the same requests. Of course, in this case the requests will be made only by my computer against the server, but hey, better than nothing. Questions: is there some app that does this already? Would you use wget? ab? A Python script? Thanks!
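
    A minimal sketch of the replay idea in Python (the log path, target host, and common/combined LogFormat are assumptions; only GETs are replayed so the test has no side effects):

        import re
        import urllib.request

        # Assumptions: the log uses the common/combined format, lives in
        # access.log, and the rebuilt server answers at TARGET.
        LOG_LINE = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) HTTP/[\d.]+"')
        TARGET = "http://new-server.example.com"

        with open("access.log", encoding="utf-8", errors="replace") as log:
            for line in log:
                match = LOG_LINE.search(line)
                if not match or match.group("method") != "GET":
                    continue  # skip POSTs etc.; replaying them could alter data
                try:
                    urllib.request.urlopen(TARGET + match.group("path"), timeout=10).read()
                except OSError as err:
                    print(match.group("path"), "->", err)

    Once the extracted URL list looks right, ab can hammer the hottest URLs and JMeter can replay the whole list with real concurrency; a plain loop like this only checks that the requests still work.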

    Read the article

  • Files not accessible

    - by gokul
    My system is running on a PC whose C: drive is out of space, so I tried to delete some files to free it up. I found that %TEMP% {C:\Users\Username\AppData\Local\Temp} takes a lot of space and tried to delete the files in it. But when I opened it, it alerted me with the message "C:\Users\Username\AppData\Local\Temp is not accessible. The file or directory is corrupted and unreadable." What should I do? Is deleting files from Temp harmful to the computer?

    Read the article

  • Unable to see hidden files

    - by TheGoodUser-Sp
    I have BitDefender (with the latest update) on Windows 7. I want to see hidden files, so from Tools > Folder Options > View I change the settings as below and click OK: But I still can't see hidden files, and when I double-check the Options I see the settings have reverted automatically, as below: I know this is virus-like behaviour! Checking with VirusTotal via Process Explorer didn't help: How can I work out which process is related to this issue?

    Read the article

  • Conflicts with MS Office temporary files when using Offline Folders on Vista

    - by Tambet
    We are using the Offline Folders feature of Windows Vista to make files on network shares available when out of the office. Mostly it is working, but every time I do a sync I get a lot of errors like this: D500E7B8.tmp - A file was deleted on this computer and changed on the server while this computer was offline. There are hundreds of them. I always select all of them and choose the resolution "Delete from both locations". But what is causing this and how can I avoid it? I suspect the reason is that we are using Debian and Samba (3.4.7) on our file server. I've been looking for Samba options that would cure this, but with no success. I learned that the probable cause is that both Word and Excel use a specific pattern to change files: they never change the original file, but instead always write a new temporary file and rename it to the original file name when you click Save. This is documented here: http://support.microsoft.com/kb/211632/?FR=1.

    Read the article

  • Unable to delete files in Temporary Internet Files folder

    - by Johnny
    I'm on Win7. I have a large number of large .bin files, totaling 183GB, in my Temporary Internet Files folder. They all seem to come from video-sharing sites like YouTube. The files are invisible in Explorer even after allowing viewing of hidden files. The only way I can see them is by issuing "dir /fs" on the command line. When I try to delete them from the command line, nothing happens. Trying to delete the whole folder from Explorer results in "access denied" because another process is using a file in the folder (IE is not running while I'm doing this). Trying to clear the folder using IE is also unsuccessful. How do I delete these files? And how did they end up there without being deleted by IE?

    Read the article

  • Storing source files outside project file directory in Visual Studio C++ 2009

    - by Skurmedel
    Visual Studio projects assume all files belonging to the project are situated in the same directory as the project file, or one underneath it. For a particular project (in the non-Visual-Studio sense) this is not what I want. I want to store the MSVC-specific files in another folder, because there might be other ways to build the application as well, for example with SCons. Also, all the stuff MSVC spits out clutters the source directory. Example:

        /source
        /scons
        /msvc    <- here is where I want my MSVC-specific stuff

    I can add the files to the source directory manually in Explorer and then link them in Visual Studio with the project. It's not the end of the world, but it annoys me a bit that Visual Studio tries to dictate the folder structure of my project. I was looking through the schemas for the project files but realized that this annoying assumption is in the IDE, not the format of the project files. Does someone know a neater way to solve this than manually linking files to the project from the source directory?

    Read the article

  • Delete temporary files from batch script in xp

    - by Keith Bentrup
    I'm looking for a good batch script that would quickly find and clean all the known-safe temporary folders/files from Windows machines (as many variants as possible), e.g. the Windows temp folder, all users' IE temp folders, etc. I'm fond of UI tools like CCleaner (over Cleanmgr.exe), but when I'm trying to clean several computers quickly and/or with minimal involvement, it would be nice to have a script. Plus, with a script I could chain several scripts together - maybe one to then fire up various antivirus and/or malware detectors. Anyone have a good one, or can you point me to a good resource?
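
    Not the batch file asked for, but a minimal sketch of the same sweep in Python (the folder list is an assumption - extend it with each profile's IE cache and other known-safe locations; locked files are simply skipped):

        import os
        import shutil
        import tempfile

        # Assumed starting list; %TEMP% resolves for the current user.
        temp_dirs = [
            tempfile.gettempdir(),                 # current user's %TEMP%
            os.path.expandvars(r"%WINDIR%\Temp"),  # system-wide temp folder
        ]

        for root in temp_dirs:
            for name in os.listdir(root):
                path = os.path.join(root, name)
                try:
                    if os.path.isdir(path):
                        shutil.rmtree(path)
                    else:
                        os.remove(path)
                except OSError:
                    pass  # in-use/locked files are expected; leave them alone

    Scheduled per machine, this covers the "minimal involvement" requirement and can be chained with the antivirus runs mentioned above.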

    Read the article

  • Expire Files In A Folder: Delete Files After x Days

    - by Brett G
    I'm looking to make a "drop folder" on a Windows shared drive that is accessible to everyone. I'd like files to be deleted automagically if they sit in the folder for more than X days. However, it seems like all the methods I've found to do this use the last-modified date, last-access time, or creation date of a file. I'm trying to make this a folder that a user can drop files in to share with somebody. If someone copies or moves files into here, I'd like the clock to start ticking at that point. However, the last-modified date and creation date of a file will not be updated unless someone actually modifies the file, and the last-access time is updated too frequently - it seems that just opening a directory in Windows Explorer will update it. Anyone know of a solution to this? I'm thinking that cataloging the hash of files on a daily basis and then expiring files based on hashes older than a certain date might be a solution, but taking hashes of files can be time-consuming. Any ideas would be greatly appreciated! Note: I've already looked at quite a lot of answers on here - File Server Resource Manager, PowerShell scripts, batch scripts, etc. They still use the last-access time, last-modified time or creation time, which, as described, do not fit the above needs.
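
    A lighter variant of the cataloging idea, sketched in Python (the share path, catalog location, and 14-day window are assumptions, and a flat drop folder is assumed): record when each file is first seen and expire on that timestamp, so no hashing is needed:

        import json
        import os
        import time

        DROP_DIR = r"\\server\share\drop"    # assumed drop-folder path
        CATALOG = r"C:\scripts\first_seen.json"
        MAX_AGE = 14 * 24 * 3600             # X = 14 days, in seconds

        # Load the first-seen timestamps recorded by previous runs.
        seen = {}
        if os.path.exists(CATALOG):
            with open(CATALOG) as f:
                seen = json.load(f)

        now = time.time()
        for name in os.listdir(DROP_DIR):
            first = seen.setdefault(name, now)  # new arrival: clock starts now
            if now - first > MAX_AGE:
                os.remove(os.path.join(DROP_DIR, name))

        # Forget files that are gone (expired above or removed by hand).
        seen = {n: t for n, t in seen.items()
                if os.path.exists(os.path.join(DROP_DIR, n))}
        with open(CATALOG, "w") as f:
            json.dump(seen, f)

    Run daily from Task Scheduler, this starts the clock when a file appears in the folder rather than when it was last modified or accessed.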

    Read the article

  • Mac terminal: Resource temporarily unavailable

    - by user167108
    I'm getting an error message in the Mac Terminal when I try to run several different processes. I did some googling and looking on this site, and found out that it might be related to having too many processes running at one time. However, I'm getting these error messages when I have only a few windows open (far fewer than I'm accustomed to having). Looking in Activity Monitor, my %User number is around 25% and the %System number is around 15%. In the past I have had both much, much higher (until the people at the Apple Store told me to keep an eye on it). So with these numbers lower now, what explains the "Resource temporarily unavailable" error message?

    In the heroku (cloud hosting) console:
        -bash: fork: Resource temporarily unavailable
        -bash-3.2$

    Upon opening a new window in the terminal:
        sh: fork: Resource temporarily unavailable
        sh: fork: Resource temporarily unavailable

    Trying to run:
        -bash: fork: Resource temporarily unavailable

    Read the article

  • Undelete big files - mission impossible?

    - by johnrembo
    Hi, I've accidentally deleted an outlook.pst (6.7GB) file while there was only 400MB of free space left on the primary NTFS partition (WinXP). I've tried several recovery tools to get this file back. "Ontrack Easy Recovery Pro" found 0 PST files (complete scan mode), while "Recover My Files" in sector scan mode found 5 PSTs, but 4 of them were between 3 and 28 KB, while the 5th was 1GB. I managed to successfully recover the 1GB PST file, which turned out to be a year-old copy (the one used after the latest Windows reinstall). Now I'm frustrated and confused. Why was a year-old file successfully recovered if there was only 400MB left on the primary partition? Where has the 6.7GB file gone? I did some reading (i.e. here), and it seems there's almost no chance of retrieving the file I'm looking for. But wait - none of the recovery tools I've used found a zero-sized PST file. Moreover, if the file were corrupted due to fragmentation, we could use scanpst.exe to fix some errors and survive with 10 or 100 emails missing - whatever. Could you please recommend some more sophisticated recovery tools for this particular task? Appreciate your help - thanks in advance.

    Read the article

  • Disk fragmentation when dealing with many small files

    - by Zorlack
    On a daily basis we generate about 3.4 million small JPEG files. We also delete about 3.4 million 90-day-old images. To date, we've dealt with this content by storing the images in a hierarchical manner. The hierarchy is something like this: /Year/Month/Day/Source/. This hierarchy allows us to effectively delete days' worth of content across all sources. The files are stored on a Windows 2003 server connected to a 14-disk SATA RAID6. We've started having significant performance issues when writing to and reading from the disks. This may be due to the performance of the hardware, but I suspect that disk fragmentation may be a culprit as well. Some people have recommended storing the data in a database, but I've been hesitant to do this. Another thought was to use some sort of container file, like a VHD or something. Does anyone have any advice for mitigating this kind of fragmentation?

    Additional info: the average file size is 8-14KB. Format information from fsutil:

        NTFS Volume Serial Number:       0x2ae2ea00e2e9d05d
        Version:                         3.1
        Number Sectors:                  0x00000001e847ffff
        Total Clusters:                  0x000000003d08ffff
        Free Clusters:                   0x000000001c1a4df0
        Total Reserved:                  0x0000000000000000
        Bytes Per Sector:                512
        Bytes Per Cluster:               4096
        Bytes Per FileRecord Segment:    1024
        Clusters Per FileRecord Segment: 0
        Mft Valid Data Length:           0x000000208f020000
        Mft Start Lcn:                   0x00000000000c0000
        Mft2 Start Lcn:                  0x000000001e847fff
        Mft Zone Start:                  0x0000000002163b20
        Mft Zone End:                    0x0000000007ad2000

    Read the article

  • SQL SERVER – Fix: Error: 10920 Cannot drop user-defined function. It is being used as a resource governor classifier

    - by pinaldave
    If you have not read yesterday's detailed primer on Resource Governor, SQL SERVER – Simple Example to Configure Resource Governor – Introduction to Resource Governor, I suggest you read it before continuing this article. After the article was published, the very first email I received was as follows: "Pinal, I configured Resource Governor on my development server and it worked fine with the tests I ran. After doing some tests, I decided to remove Resource Governor, and as a first step I disabled it; however, I was not able to drop the classification function during the clean-up. It kept giving me the following error:

        Msg 10920, Level 16, State 1, Line 1
        Cannot drop user-defined function myudfname. It is being used as a resource governor classifier.

    Would you please give me a solution?" The original email was really this short and there is no other information. I am glad he ran his experiments on a development server and not on the production server - the production server must not be a playground for experiments. I think I have covered the answer to this error in an earlier blog post. Even if the user disables Resource Governor, it is still not possible to drop the function, because Resource Governor can be enabled again and, when enabled, it can still use the same function. Here is the simple resolution for dropping the classifier function (do this only if you are not going to use the function). The classifier function can't be dropped because it is associated with Resource Governor. Create a new classifier function for your Resource Governor, or just assign NULL as in the following T-SQL script, and you will be able to drop the function without error:

        ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = NULL)
        GO
        ALTER RESOURCE GOVERNOR DISABLE
        GO
        DROP FUNCTION dbo.UDFClassifier
        GO

    I am glad that the user asked me the question instead of doing something radically different which could leave the server in an unusable state. This is the only method I am aware of to avoid this error. Is there a better way to achieve the same? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Error Messages, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Part 7: EBS Modifications and Flagged Files in R12

    - by volker.eckardt(at)oracle.com
    Let me, based on my previous blog, explain the procedure for flagged files a bit better, illustrated with screenshots. Flagged files are a concept within Oracle E-Business Suite (EBS) Release 12 where you flag a standard deployment file - say a Forms file, a package, or a Java class file. When you run the patch analysis, the list of flagged files is checked, and if one of these files gets patched, the analysis report will tell you. Note: this functionality is also available in Release 11, where it is implemented and known as "applcust.txt". You can flag as many files as you want, in whatever relationship they stand to your customizations. In addition to the flag itself you can add a comment; use this comment to point to your customization reference (here XXAR_RPT_066 or XXAP_CUST_030). Consider the following two cases:

    1. You have created your own report, based on a standard report. In this case you flag the report file itself and the key views it uses. When a patch updates one of these files, you will be informed and can initiate a proper review and testing. (Example: the first line for ARXCTA.rdf.)
    2. You have created an extensive personalization, and because it is business critical you would like to be informed if the page definition gets updated. In this case you register the PG.xml file as a flagged file. (Example: the second line below for CreateExtBankAcctPG.xml.)

    The menu path to register flagged files is:
    (R) System Administrator > (M) Oracle Applications Manager > Site Map > Maintenance > Register Flagged Files

    Your DBA should now run the patch analysis every time he is going to apply a new patch:
    (R) System Administrator > (M) Oracle Applications Manager > Patch Wizard > Task "Recommend/Analyze Patches"

    The screenshot above shows the impact summary. For this blog entry the number "2", titled "Flagged Files Changed", is our focus. When you click the "2" you get a screen similar to the first one in this blog, showing you exactly which files will get patched if you continue and apply this patch in this environment right now. Note: it is also shown that just 20% of all patch files will get applied. This situation might be different if your environments are on a different patch level, and the customization impact might then certainly differ as well. The flagging step can be done directly in Oracle Applications Manager; our developers are responsible for it. To transport such a flag and comment we use an FNDLOAD script. It is suggested to put the flagged-files data file directly into your CEMLI patch; this way the flagged-files registration is executed at the same time the patch gets applied.

    Process steps:

    Developer:
    - Builds the CEMLI
    - Reviews the code and identifies the key standard objects referenced
    - Determines the standard object files and flags them
    - Creates the FNDLOAD file and adds it to the CEMLI patch

    DBA:
    - Executes the patch analysis for every new Oracle standard patch in a representative environment
    - Checks and retrieves the flagged files and comments
    - Sends the flagged-file list back to the development team for analysis/retest

    Developer:
    - Analyses, updates and retests the affected CEMLIs

    Prerequisite: the patch analysis has to be executed in an environment where flagged files have been registered. (If you run the patch analysis in a vanilla or outdated environment compared to your PROD, the analysis will not be very helpful!)

    When to start with flagged files? Start right now utilizing this feature. It is an investment that improves production stability and helps fulfil your SLA!

    Summary: Flagged files are a very helpful EBS R12 technique when analysing patches. Implement a procedure within your development process to maintain such flags, and let the DBA run the patch analysis in an environment with a patch and customization level similar to your current production.

    Related links: EBS Patching Procedures - Chapter 2-13 - Registered Flagged Files

    Read the article

  • Visual Studio 2008: version info for all files updated from one place

    - by ravenspoint
    The version information displayed when the mouse cursor hovers over a file in Windows Explorer is set, for a file built by Visual Studio, in the VERSION resource. I would like to set the version in one place for all the files built by a solution, preferably when I change the version in the install properties. Is there a way to do this? The motivation is that if the version is not updated for a file, the installer will leave previous versions of files in place instead of replacing them with new ones. This happens even when the 'RemovePreviousVersions' property is set. To save the tedious and error-prone task of updating the version in every file built and installed, I currently remove the version resource from all files - which is not elegant.

    Read the article

  • RESTful application logic and cross-resource operations

    - by Gaz_Edge
    I have a RESTful API that allows my users to receive enquiries about their business, e.g. 'I would like to book service x on date y. Is this available?'. The API saves this information as a resource at the following URI: users/{userId}/enquiries/{enquiryId}. The information shown when this resource is retrieved is the standard sort of thing you'd expect from an enquiry: email, first_name, last_name, address, message. The API also allows customers to be created for a user. The customer has a login and password and also a profile. The following URIs expose these two resources: PUT users/{userId}/customers/{customerId} and PUT users/{userId}/customers/{customerId}/profile. The problem I am having is that I would like to allow users to create a customer from an enquiry. For example, the user is able to offer their service on the date requested and will then want to set up a customer with login details etc. to allow them to manage the rest of the process. The obvious answer would be to use a URI like users/{userId}/enquiries/{enquiryId}/convert-to-client. The problem with this is that it somewhat goes against a lot of what I've been reading about how to implement REST (specifically the book Restful Web Services, which suggests that URIs should point to resources, not operations on resources). The other option would be to get the client application (i.e. the code that calls the API) to handle some of this application logic, but that doesn't quite feel right to me. In my design the client app is fairly dumb: it knows just enough to display the results from the API and contains no application logic. It would be great to hear others' views on the best way of setting this up. Am I wrong to have no application logic in the client app? How would I perform this operation purely in the REST API?
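
    One common noun-based pattern (a sketch, not from the question - the endpoint and field names are assumptions): model the conversion as creating a customer whose representation references the source enquiry, so the URI still names a resource rather than an operation:

        import json
        import urllib.request

        # Hypothetical: POST a new customer that points back at the enquiry
        # it was created from; the server copies the enquiry data across.
        payload = json.dumps({"source_enquiry": "/users/42/enquiries/7"}).encode("utf-8")

        request = urllib.request.Request(
            "https://api.example.com/users/42/customers",
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )

        # The server would answer 201 Created with a Location header naming
        # the new customer resource. Uncomment to actually send it:
        # with urllib.request.urlopen(request) as response:
        #     print(response.status, response.headers["Location"])

    The conversion logic then lives server-side, where the API owns it, and the client app stays as dumb as the design above requires.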

    Read the article

  • Cluster Core Resource state of Exchange 2010 DAG

    - by Christoph
    I have two Exchange 2010 servers in a DAG, plus a witness server, to implement mailbox resiliency. The two Exchange servers are in two subnets, and the Windows failover cluster therefore has two IP address resources. I know that Exchange uses "core functionality" of Windows Server failover clustering but does not use all its features. My setup also seems to work, but if I run the validation in the Windows Failover Cluster Manager, it complains about one of the IP address resources being offline. However, I cannot bring this resource online, because the server complains that "the specified cluster node is not the owner of the resource, or the node is not a possible owner of the resource". If I "Simulate failure of this resource", it goes offline and the other IP comes online. I have the vague idea that Exchange might use the state of the IP resource to identify the Primary Active Manager, but I am not sure. As it is obviously important that failover really works, I would like to be sure. Therefore, my question is: is it normal that only one IP address resource in an Exchange 2010 DAG failover cluster is active at a time? If not, how do I bring both resources online at the same time, given the error described above?

    Read the article

  • VMWare Server - Writing files to virtual hard drive performance

    - by Ardman
    We have just moved our infrastructure from physical servers to virtual machines. Everything is running great and we are happy with the result of the move. We have identified one problem, and that is reading/writing performance. We have an application that compiles files and writes to disk. This is considerably slower on the new virtual machines compared to the physical machines. Is there a performance bottleneck when writing to a virtual hard drive compared to a physical hard drive?

    Read the article

  • Where can I find WebSphere configuration files?

    - by Nicholas Key
    Hi there, I would like to know where the WebSphere configuration details are saved - specifically, the configuration details that are shown in the Administrative Console (from the web) or from the console using wsadmin. Some examples would be: Java and Process Management (class loader, process definition, process execution) and Container Settings (session management, SIP container settings, web container settings, portlet container settings). Are there XML files that persist these configuration details? Nicholas

    Read the article

  • Apache - Serving static files from different subdomain + machine

    - by rubayeet
    Here's the scenario: a site is running on the domain www.someserver.com, and I'm going to host subdomain.someserver.com on my machine. Let's say all the image files are under the directory 'img'. I don't want to copy all their images to my machine. So what should the Apache directive(s) be that map a request for an image like http://subdomain.someserver.com/img/image.png to http://www.someserver.com/img/image.png?
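
    A minimal sketch of one way to do it (assuming Apache 2.x with mod_alias loaded; the port and paths mirror the question):

        # Redirect image requests on the subdomain to the main site, so the
        # browser fetches them from www.someserver.com directly.
        <VirtualHost *:80>
            ServerName subdomain.someserver.com
            RedirectMatch 302 ^/img/(.*)$ http://www.someserver.com/img/$1
        </VirtualHost>

    If the URLs must appear to stay on the subdomain, mod_proxy's "ProxyPass /img/ http://www.someserver.com/img/" fetches the images server-side instead, at the cost of routing that traffic through this machine.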

    Read the article
