Search Results

Search found 30742 results on 1230 pages for 'folder size'.


  • Windows 7 Taskbar doesn't remember configuration

    - by eidylon
    Hi all. I have my taskbar on Windows 7 fully customized: it sits at the top of the screen, and I have the address bar, Desktop toolbar, and a custom folder toolbar all turned on. It is set to two rows, with the taskbar and Desktop toolbar on one row and the address bar and custom folder toolbar on the second. When I shut down and restart the computer, it remembers that the taskbar is at the top of the screen, but it does not remember how the various bars were organized; it collapses everything into a single row with the bars all mushed together. Has anyone else had this, or does anyone know how to fix it? Running Windows 7 Ultimate x64.


  • How to avoid the loop device limitation in an OpenVZ container?

    - by mat.viguier
    On an OpenVZ container running Debian 7, I need to cap the maximum size of a folder that is used for uploads to a PHP-based web server. The directory is synced, so I have to enforce the maximum size, and the limit should be upgradable later by adding a physical disk. My plan was to use a file as a block device for a file system, so I did:

        dd if=/dev/zero of=/disk2/filesystem.dat bs=1M count=100

    Then I made a filesystem on it:

        mkfs.ext4 filesystem.dat

    Then I tried to mount it:

        mkdir /opt/filesystem; mount /disk2/filesystem.dat /opt/filesystem

    My OpenVZ container (it is on a VPS) has no loop module in the kernel, so I got the usual "Could not find any loop device" error. I think I have to use FUSE, but I really do not know how. Any ideas on capping the size of a directory under OpenVZ?
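
    A hedged workaround sketch, assuming FUSE is usable inside the container (the host must expose /dev/fuse) and that a FUSE ext2/3/4 driver is installed; the exact binary name and option syntax vary between the fuseext2/fuse-ext2 packages, so check the man page. The idea is to mount the image file through FUSE so no loop device is needed; note that write support in fuse-ext2 is still marked experimental.

        apt-get install fuseext2            # Debian package name may differ (fuse-ext2)
        mkdir -p /opt/filesystem
        fuseext2 -o rw+ /disk2/filesystem.dat /opt/filesystem   # rw+ enables (experimental) writes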


  • Samba share external USB device

    - by bioShark
    I managed to share folders from my Ubuntu 12.04 machine on my private network, and the data is visible from a Windows machine. I even shared an HDD that has Windows on it, so everything seems to work fine. However, when I share a mounted device (USB pen drive, USB HDD, etc.), the Windows machine reports: Access denied on file \... I realize this is due to missing rights on the mount point: by default a mounted device gets the equivalent of 700 (drwx------), owned by me. But I can't seem to change the permissions on the external device; they remain 700. Is there a special trick to sharing NTFS-formatted USB devices? Thanks. P.S. From this question I see that NTFS devices cannot be shared. Is this true? It seems strange, because my PC has 2 HDDs with 3 NTFS partitions, and I can share those without a problem.
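
    One hedged approach: permissions on an NTFS volume cannot be changed with chmod after the fact; instead, set the owner and mode at mount time through ntfs-3g options so Samba can serve the share. The device name, mount point, and ids below are assumptions:

        sudo umount /media/usbdrive
        sudo mount -t ntfs-3g -o uid=1000,gid=1000,umask=007 /dev/sdb1 /media/usbdrive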


  • Find command exclude files whose path match a certain pattern

    - by user40570
    I have a find command that looks for files that were modified recently and lists them:

        find /path/on/server -mtime -1 -name '*.js' -exec ls -l {} \;

    I would like it to exclude any deeply nested folder that matches a certain pattern; e.g., there are a number of folders that contain a "statistics" directory and ".svn" directories. I'd like to be able to say: if a file that was modified yesterday is in a folder named statistics, ignore it. Or, better, don't search those folders at all.
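
    A hedged sketch using -prune, which keeps find from descending into the matched directories at all:

        find /path/on/server \( -name statistics -o -name .svn \) -prune -o \
             -mtime -1 -name '*.js' -exec ls -l {} \;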


  • MS Excel: Can I link images using a relative path?

    - by Port Islander 2009
    I am working on an MS Excel document that contains a lot of (around 200) images. They are currently saved within the document, so the file becomes huge and working with it gets very slow. Linking the pictures without saving them works very well: I now have the Excel document and a folder "pictures" next to it that contains all my image files. However, when I move the document and the folder to a new location, all my pictures disappear. This seems to be because Excel saves the link information as absolute paths. (Update: actually, according to this thread, Excel stores the link information as relative paths as well, so now I really don't know why my links break.) Is there a convenient way to save them as relative paths, or to have Excel automatically update the path information? Update: it's important that the images get displayed on the sheet and can be printed. I am working with Microsoft Excel for Mac 2008 and 2011. I really appreciate your help.


  • Is my graphics card in use or not?

    - by Lindhe94
    I have a Samsung Series 7 NP730U3E running Ubuntu GNOME 13.10. This computer has an Intel Core i5 3337U and an AMD Radeon HD 8570M inside. Ubuntu 13.10 is said to have driver support for this graphics card, but I am not sure whether that is the case. When I check System Settings > Details it says "Graphics: Intel® Ivybridge Mobile", and lspci | grep VGA returns:

        VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)

    But lshw -c video returns:

        *-display
             description: Display controller
             product: Mars [Radeon HD 8730M]
             vendor: Advanced Micro Devices, Inc. [AMD/ATI]
             physical id: 0
             bus info: pci@0000:01:00.0
             version: 00
             width: 64 bits
             clock: 33MHz
             capabilities: pm pciexpress msi bus_master cap_list rom
             configuration: driver=radeon latency=0
             resources: irq:47 memory:e0000000-efffffff memory:f7e00000-f7e3ffff ioport:e000(size=256) memory:f7e40000-f7e5ffff
        *-display
             description: VGA compatible controller
             product: 3rd Gen Core processor Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 09
             width: 64 bits
             clock: 33MHz
             capabilities: msi pm vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:46 memory:f7800000-f7bfffff memory:d0000000-dfffffff ioport:f000(size=64)

    So which is it? Is my graphics card in use, or does my laptop have undiscovered powers yet to yield?
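
    A hedged way to check which GPU is actually rendering, assuming the open-source hybrid-graphics (PRIME) stack that goes with the radeon/i915 drivers shown above; glxinfo is in the mesa-utils package:

        glxinfo | grep "OpenGL renderer"                # which GPU renders by default
        DRI_PRIME=1 glxinfo | grep "OpenGL renderer"    # explicitly request the discrete (radeon) GPU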


  • How to automount a Truecrypt volume before login in Windows 7?

    - by nonoitall
    I have an external hard drive containing all my documents, and it is encrypted with a password via TrueCrypt. I'd like my desktop computer at home to automatically mount the volume prior to my logging in (so that it can be used as my user folder) without asking me for a password. (Yes, the password can be saved in plain text on my desktop's hard drive; that's okay.) For the life of me, I can't figure out a way to do this that actually works. I tried using Task Scheduler to mount the volume when the computer starts up, and it works, but the volume is only accessible by my user account after I log in. (I haven't tried every combination of users/options for the scheduled task, so maybe there's something else there I need to try.) I also tried adding a startup script to my user account that runs at login, which is evidently too late to set up the user's profile folder. Has anybody ever achieved this, or something like it?
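
    A hedged sketch using TrueCrypt's documented command-line switches: schedule this to run at system startup under the SYSTEM account (rather than your user), so the volume is mounted machine-wide before any logon. The device path is an assumption; check it in TrueCrypt's volume selector.

        rem mount the encrypted drive as X: with no prompts (password in plain text, as accepted above)
        "C:\Program Files\TrueCrypt\TrueCrypt.exe" /v \Device\Harddisk1\Partition1 /l x /p MyPassword /q /s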


  • What are 'damaged files' on external hard drive (HFS format for OS X)?

    - by dtlussier
    I have an external HD formatted as default HFS (Mac OS Extended, Journaled), and every once in a while I get a folder called DamagedFiles in the root of the volume. The folder contains a collection of links to files on the drive. In general the files seem fine; I am, for example, able to open the images or text files without a problem. Is this serious? What can I do to fix this problem? Any advice would be great, as I couldn't find anything on here or via Google that addresses this problem in particular. Many thanks.
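
    A hedged first step is a filesystem check, which is usually what that folder is pointing to (the volume name below is an assumption):

        diskutil verifyVolume /Volumes/ExternalHD
        diskutil repairVolume /Volumes/ExternalHD   # run if verification reports errors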


  • Automatic subdomain creation in htaccess on Apache

    - by ANOther8660
    I have a domain in my HOSTS file: www.mytestbusiness.com. I want to convert some folders into subdomains automatically, e.g. www.mytestbusiness.com/birmingham and www.mytestbusiness.com/london should become www.birmingham.mytestbusiness.com and www.london.mytestbusiness.com. For some folders, though, I want to keep the domain/folder form, e.g. www.mytestbusiness.com/styles/: I don't want the CSS folder becoming a subdomain, or certain folders like cgi-bin, dwoo etc. (dwoo contains the site templates!). I am running Apache 2.2 on Windows 7 Home Edition, and the site has no issues; the problem is creating subdomains in .htaccess without having to declare them manually. What's the best way to do this, other than manually declaring them in httpd-vhosts.conf as I used to do? Thanks
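
    A hedged sketch of one vhost-level approach (assumes mod_rewrite is loaded; the DocumentRoot is an assumption, and each subdomain still needs its own HOSTS entry since the Windows HOSTS file has no wildcards): capture the subdomain with mod_rewrite and map it onto the matching folder. Folders like styles, cgi-bin, and dwoo stay folder-only automatically, because requests for them arrive on the www host, which is skipped.

        <VirtualHost *:80>
            ServerName mytestbusiness.com
            ServerAlias *.mytestbusiness.com
            DocumentRoot "C:/sites/mytestbusiness"
            RewriteEngine On
            # loop guard: skip requests already rewritten into /<subdomain>/
            RewriteCond %{HTTP_HOST}::%{REQUEST_URI} !^([^.:]+)\.mytestbusiness\.com::/\1/ [NC]
            # ignore the bare domain and www
            RewriteCond %{HTTP_HOST} !^(www\.)?mytestbusiness\.com$ [NC]
            # capture the subdomain label (kept last so %1 is set for the rule below)
            RewriteCond %{HTTP_HOST} ^([^.]+)\.mytestbusiness\.com$ [NC]
            RewriteRule ^/?(.*)$ /%1/$1 [L]
        </VirtualHost>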


  • ionice idle is ignored

    - by Ferran Basora
    I have been testing the ionice command for a while, and the idle (3) class seems to be ignored in most cases. My test is to run both commands at the same time:

        du <big folder>
        ionice -c 3 du <another big folder>

    If I check both processes in iotop, I see no difference in the percentage of I/O utilization for either process. As for the CFQ scheduler, I'm using a 3.5.0 Linux kernel. I started doing this test because I'm experiencing system lag each time the daily cron job updatedb.mlocate runs on my Ubuntu 12.10 machine. If you check the /etc/cron.daily/mlocate file, you'll see the command is executed as:

        /usr/bin/ionice -c3 /usr/bin/updatedb.mlocate

    Also, the funny thing is that whenever my system starts using swap for some reason, the updatedb.mlocate I/O gets scheduled ahead of the kswapd0 process, and then my system gets stuck. Any suggestions?

    References:
    http://ubuntuforums.org/showthread.php?t=1243951&page=2
    https://bugs.launchpad.net/ubuntu/+source/findutils/+bug/332790
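
    One hedged check: ionice scheduling classes are only honored by the CFQ I/O scheduler, so it is worth confirming CFQ is actually active for the disk in question (sda is an assumption):

        cat /sys/block/sda/queue/scheduler                   # the active scheduler is shown in brackets
        echo cfq | sudo tee /sys/block/sda/queue/scheduler   # switch to CFQ if it is not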


  • How to convert a power point pdf to a pdf that is easy to read on kindle?

    - by SpaceTrucker
    I have several PowerPoint presentations as PDFs. I would like to read them on the original Kindle in landscape format, but when I read them as-is, a single slide won't fit on the Kindle's display. I thought the easiest way to convert the PDFs was to reprint them with a PDF printer; however, I don't know what paper size to use. I already tried Calibre, as suggested by this question, but the output is not usable because of formatting issues. So what paper size should I use with the PDF printer to reprint them in landscape format, or are there other tools I could use for this task?
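
    A hedged starting point for the reprint route: the original Kindle's screen is 600x800 pixels at roughly 167 ppi, or about 3.6 x 4.8 inches, so a custom landscape paper size of about 4.8 x 3.6 inches should map one slide to one screen. Alternatively, a sketch with k2pdfopt, a tool built for Kindle-optimizing PDFs (flags from memory; worth double-checking against its help output):

        k2pdfopt slides.pdf -dev k2 -ls -o slides_kindle.pdf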


  • Is 10% too much for autogrow on a 4 GB sql server DB?

    - by ntsue
    I am getting the following error:

        2011-03-07 21:59:35.73 spid64 Autogrow of file 'MYDB_DATA' in database 'MYDB' was cancelled by user or timed out after 16078 milliseconds. Use ALTER DATABASE to set a smaller FILEGROWTH value for this file or to explicitly set a new file size.

    I did some research and found that for large databases you should set autogrow to a fixed size (MB), not a percentage. I feel like this database is not large, though, and that I may not be addressing the correct issue by changing this value. Does anyone have any opinions? Thank you! EDIT: I should have specified SQL Server 2008 RC2 running on Windows Server 2008.
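
    For reference, a hedged sketch of the fixed-increment change the error message suggests (the 256 MB figure is an assumption; on a 4 GB database even 64 MB may be plenty):

        ALTER DATABASE MYDB
            MODIFY FILE (NAME = MYDB_DATA, FILEGROWTH = 256MB);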


  • Backup Solr home

    - by user226188
    I'm new to Solr: I've successfully installed Tomcat, the Solr 4.3.1 webapp, and two collections on a CentOS 6.4 machine. Now my server is in production and I need to make backups of Solr, so I would like to know the best way to do that. For the moment I'm doing: stop Tomcat, tar my Solr home, start Tomcat, but I've read that this is not a good solution. Moreover, it means stopping all of Tomcat, which hosts other webapps besides Solr. I've also heard that there is a script named "backup" in the Solr home's bin folder, but my bin folder is empty. I don't want to set up another slave server with replication; for me that's not a backup solution, because my backups are supposed to be sent to a Bacula backup server every night. Is there no built-in solution I can script around, like mysqldump for MySQL servers? Thanks for the help!
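
    A hedged alternative that avoids stopping Tomcat, assuming the default ReplicationHandler is enabled on each core (core name, port, and paths are assumptions): Solr's replication handler can snapshot a live index over HTTP, and a nightly script can then hand the snapshot directory to Bacula. The configuration files in each core's conf/ directory are small and can be tarred separately without stopping anything.

        # trigger an online snapshot of the index; repeat per collection
        curl 'http://localhost:8080/solr/collection1/replication?command=backup&location=/backups/solr&numberToKeep=2'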


  • Versioning files in Windows XP

    - by Mike Cole
    I would like to set up an archive folder in Windows XP that would let me drop several different versions of the same file into it and have it store each version. I envision this working similarly to the Recycle Bin, where you can drop the same file 10 times and it stores each copy. Does anybody know how I can do this? Thanks! Edit: Using a version control system is complete overkill for this situation. I may just write a script that appends a date/time stamp to the file when it is added to the folder.
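
    A hedged sketch of that script as a batch file (the %date%/%time% substring offsets assume a US-style short date format; adjust them for your locale, and the archive path is an assumption):

        @echo off
        rem usage: archive.cmd <file> -- copies the file into the archive with a timestamp suffix
        set ts=%date:~-4%-%date:~4,2%-%date:~7,2%_%time:~0,2%%time:~3,2%%time:~6,2%
        set ts=%ts: =0%
        copy "%~1" "C:\Archive\%~n1_%ts%%~x1"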


  • How can I get command prompt to merge my files in name order?

    - by Anastasia
    I'm using the copy command in Command Prompt to merge all the files in a directory, for a number of directories. The problem is, I need to edit the first file in each directory before I merge. This means that when I run "copy /b *.mp3 name.mp3", the joined file has part 2 at the start and part 1 at the end, presumably because part 1 was modified last. Is there a way of using the copy command so that the files merge in name order? Each folder has a different number of parts, anywhere from 2 to 1000, so I don't want to list each file with a "+" in between. Ideally, I'd like to find something to insert into the copy command I'm already using. Otherwise, is there a way of rearranging the files in a folder so that if you enter "DIR", part 1 shows up first even if it was edited last? I'm using Windows 7, by the way.
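
    A hedged sketch: feed copy from a name-sorted dir listing instead of a wildcard (the %%F form is for a batch file; use %F interactively). Note the sort is alphabetical, so file names need zero-padded numbers for part10 to come after part2.

        type nul > joined.tmp
        for /f "delims=" %%F in ('dir /b /a:-d /o:n *.mp3') do copy /b joined.tmp+"%%F" >nul
        ren joined.tmp name.mp3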


  • Scaling sprite velocity / co-ordinates in Android

    - by user22241
    I'm trying to find the answer to a question that I've had for a long time but am having trouble finding, so I hope someone can help. I'm looking for information on how to scale sprite velocity / movement / co-ordinates. What I mean is: how do I make a sprite move at the same speed relative to the screen size / DPI, so that it takes the same amount of real time to get from one side of the screen to the other? All the posts about sprite scaling that I can find on the various forums relate to the size of the sprite, and that part I'm OK with so far; the problem is that when I move a sprite, it arrives at a different speed depending on the DPI / resolution of the device. I hope I'm making sense. This is the code I have so far (all variables previously declared); instead of using explicit amounts like 1, I'm using something like:

        platSpeedFloat = (1 * (dpi / 160)); // use 1 so that on an MDPI screen the sprite moves by 1 physical pixel

    Then basically I'm doing something like this:

        platSpeedSave += platSpeedFloat;   // accumulate the fractional speed
        platSpeed = (int) platSpeedSave;   // cast to int so it can be checked below
        if (platSpeed == platSpeedSave) {  // check the casted int against the stored float
            floorY = floorY - platSpeed;   // if they match, change the Y value
            platSpeedSave = 0;             // and reset the accumulator
        }

    I'd be grateful if someone could assist; I hope I'm making sense. The above doesn't seem to work: the sprite moves 'faster' on lower-DPI screens.
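
    A hedged sketch of the usual alternative: express velocity as a fraction of the screen per second and scale by frame time, which makes the crossing time identical on every device regardless of DPI (the names and the 0.25 value are hypothetical):

        // inside the sprite class
        static final float SPEED_SCREENS_PER_SEC = 0.25f; // fraction of screen height covered per second
        float floorY;                                     // keep the position as a float; round only when drawing

        // called once per frame; deltaSeconds is the time since the last frame
        void update(float deltaSeconds, int screenHeightPx) {
            floorY -= SPEED_SCREENS_PER_SEC * screenHeightPx * deltaSeconds;
        }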


  • Tile sizes in 2D games

    - by Ephismen
    While developing a small game using the tile-mapping method, a question came to my mind: I am developing the game on Windows but wouldn't exclude adapting it to another platform. What size (in pixels) would you recommend for the tiles of a tile-mapped game (e.g., an RPG) with the following requirements?

    - An acceptable level of detail without too many tiles.
    - A decent map size.
    - Adaptation of the game to a handheld (e.g., PSP), smartphone, or computer without too much loss of detail or slowdown.
    - More or less substantial zoom-in / zoom-out.
    - A tile resolution that permits either pixel-perfect collision or block collision.

    Anything from a good explanation to a game example is useful, as long as it fits the requirements. This question may seem a bit simplistic, but I have noticed that many indie game developers use inappropriately scaled scenery. Also, sorry for the poor syntax and limited vocabulary of my question; being a non-native English speaker doesn't help when talking about computer programming.


  • De-duplicate Firefox bookmarks

    - by Zoredache
    What methods exist to de-duplicate Firefox bookmarks? As I search Google, I find that there was previously a plugin called CheckPlaces, but it no longer seems to exist. Another popular suggestion seems to be AM-DeadLink, which I tried, but it completely trashed my bookmarks. (Fortunately I had a backup first, and yes, I had closed Firefox first as instructed.) I was trying to move all my youtube.com bookmarks into a folder: I did a search and then dragged the bookmarks into the folder. Apparently this creates a copy instead of moving them as I expected, so now I have 3 of everything, since I tried a couple of times.


  • How to perform a nested mount when using chroot?

    - by user55542
    Note that this question is prompted by the circumstances detailed by me (as Xl1NntniNH7F) in http://www.linuxquestions.org/questions/linux-desktop-74/boot-failure-upon-updating-e2fsprogs-in-ubuntu-10-10-a-947328/. So if you could address the underlying cause of the boot failure, I would very much appreciate it. I'm trying to replicate the environment of my Ubuntu installation (where the home folder is on a separate partition) in order to run make uninstall, using a live CD. How do I mount a directory from one partition (sda2, mounted in Ubuntu as the home folder) into a directory on another mounted partition (sda3)? I did chroot /mnt/sda2, but I don't know how to mount sda3 to /home, and my various attempts didn't work. As I am unfamiliar with chroot, my approach could be wrong, so it would be great if you could suggest what I should do, given my circumstances.
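
    A hedged sketch, following the chroot /mnt/sda2 step described above: do the nested mount from the live CD side before entering the chroot, since the device nodes may not be visible inside it (the partition roles are taken from the question and may need swapping):

        sudo mount /dev/sda2 /mnt/sda2          # the partition to chroot into
        sudo mount /dev/sda3 /mnt/sda2/home     # nest the other partition onto its /home first
        sudo chroot /mnt/sda2 /bin/bash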


  • 3D point cloud render from x,y,z 2D arrays with texture

    - by user1733628
    I need some direction on displaying a 3D point cloud using OpenGL in C++ (VS2008). I am brand new to OpenGL and am trying to do a 3D point cloud display with a texture. I have three 2D arrays (each the same size, 1024x512) representing the x, y, z of each point. I think I am on the right track with:

        glBegin(GL_POLYGON);
        for (int i = 0; i < 1024; i++) {
            for (int j = 0; j < 512; j++) {
                glVertex3f(x[i][j], y[i][j], z[i][j]);
            }
        }
        glEnd();

    Now this loads all the vertices into the buffer (I think), but from here I am not sure how to proceed; or maybe I am completely wrong here. I also have another 2D array (same size) that contains color data (values from 0-255) that I want to use as a texture on the 3D point cloud. I understand this may be a very basic OpenGL implementation for some, but for me it is a huge learning curve. Any pointers, nudge, or kick in the right direction will be appreciated.
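
    A hedged sketch of the immediate-mode approach: a point cloud is normally drawn with GL_POINTS rather than GL_POLYGON, and the 0-255 array can drive a per-vertex color directly instead of a texture (the array name c is an assumption):

        glPointSize(2.0f);                      // make individual points visible
        glBegin(GL_POINTS);                     // one vertex per cloud point
        for (int i = 0; i < 1024; i++) {
            for (int j = 0; j < 512; j++) {
                float v = c[i][j] / 255.0f;     // normalize the 0-255 intensity
                glColor3f(v, v, v);             // grayscale per-vertex color
                glVertex3f(x[i][j], y[i][j], z[i][j]);
            }
        }
        glEnd();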


  • Partition and mount my secondary hard drive on CentOS 5.5 64bit?

    - by Andrew Fashion
    I am trying to prepare my second hard drive for user image uploads. Here is the current layout:

        # sudo parted /dev/sda print
        Model: ATA WDC WD2500KS-00M (scsi)
        Disk /dev/sda: 250GB
        Sector size (logical/physical): 512B/512B
        Partition Table: msdos

        Number  Start   End     Size    Type      File system  Flags
         1      32.3kB  107MB   107MB   primary   ext3         boot
         2      107MB   8595MB  8488MB  primary   linux-swap
         3      8595MB  10.7GB  2147MB  primary   ext3
         4      10.7GB  250GB   239GB   extended
         5      10.7GB  250GB   239GB   logical   ext3

        Information: Don't forget to update /etc/fstab, if necessary.

    I am assuming #4 is my secondary drive? How do I partition and mount it so I can begin using it? And how do I add it to fstab? I understand if that's too many questions in one; just help me with whatever you can. Thank you for any help!
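
    A hedged reading of that listing: parted shows a single 250 GB disk, so this looks like a second partition rather than a second drive; number 4 is just the extended container, and the usable space is the already-formatted logical partition 5. If partition 5 is not already mounted elsewhere (check with df), it only needs a mount point and an fstab entry (the mount point below is an assumption):

        mkdir -p /var/www/uploads
        mount /dev/sda5 /var/www/uploads
        # make the mount permanent across reboots
        echo '/dev/sda5  /var/www/uploads  ext3  defaults  0 2' >> /etc/fstab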


  • Poor TCP loopback throughput on Windows

    - by Yodan Tauber
    I measured the throughput of a locally bound TCP socket connection on my computer (Intel Q9550, 64 GB RAM, Windows XP 64-bit) using iperf. I got disappointing results (around 1.6 Gbit/s) each time, no matter how I tweaked the TCP settings (buffer length, window size, max segment size, no delay). I got similar results when I tried netperf. Now, I understand (from sources like these) that the average throughput of a loopback connection should be around 5 Gbit/s. What could be the reasons for such poor performance?
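
    For what it's worth, a hedged retest sketch with a larger window and parallel streams, which often shows whether a single connection is the bottleneck (iperf 2.x flags):

        iperf -s -w 1M                            # server side
        iperf -c 127.0.0.1 -w 1M -P 4 -l 256K     # client: 1 MB window, 4 streams, 256 KB buffers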


  • Manage Large E-Book Archive

    - by Cnkt
    I have a very large e-book archive (approx. 1 TB) in various file formats, e.g. PDF, DJVU, MOBI, and EPUB. I put the books in different folders by subject, e.g. Engineering, Programming, etc., but after many years things have gotten out of hand. The Programming folder alone is 220 GB, and many file names are cryptic. Some are well defined, like 236659889_Final_Report_of_2012_Climate_Change_Conference.pdf, but some are just ISBN numbers or simply download.pdf. I need an application for organizing and searching my e-books. I have already tried Calibre, Mendeley, and Debenu, but all of these apps want to import the files first, and I don't have a spare 1+ TB for the app's import folder. Is there a good Windows application for just indexing the file names and contents of e-books without importing them?


  • Need some critique on .NET/WCF SOA architecture plan

    - by user998101
    I am working on a refactoring of some services and would appreciate some critique of my general approach. I am working with three back-end data systems and need to expose an authenticated front-end API over HTTP binding, JSON, and REST for internal apps as well as 3rd-party integration. I've got a rough idea below that's a hybrid of what I have and where I intend to wind up. I intend to build guidance extensions to support this architecture so that devs can build it out quickly. Here's the current idea for our structure:

    - Front-end WCF routing service (spread across multiple IIS servers via hardware load balancer)
      - Load balancing of services behind routing is handled within the routing service, probably round-robin
      - One of the services will be a token service
      - Multiple bindings per service exposed to address JSON, REST, and whatever else comes up later
      - All in/out is handled via POCO DTOs
      - Use Unity to scan for what services are available and expose them
    - The front-end services behind the routing service do nothing more than expose the API and convert DTO <-> Entity
      - Unity injects the service implementation to allow mocking
      - AutoMapper for DTO/Entity conversion
      - Invoke WF services where a response is required immediately
      - Queue to the ESB for async WF; the ESB will invoke the WF later
    - Business logic WF layer
      - Exposes the same API as the front-end services
      - Implements the business logic
      - Wraps transaction context where needed
      - Calls out to composite/atomic services
    - Composite/atomic services
      - Exposed as WCF
      - One service per back-end system
      - Standard atomic CRUD operations plus composite operations
      - Supports transaction context

    The questions I have are: Is the separation of concerns outlined above beneficial? The current thought is that each layer is its own project, except the back-end stuff, where each system gets one project; each project has a ServiceHost, and all the services are under a Services folder; interfaces live in a separate project at each layer; DTOs and entities are in two separate projects under a shared folder. I am also planning to build dedicated services for shared functionality such as logging, and to overload things like TraceListener to call those services: is this a valid approach? Any other suggestions/comments? A sketch of the front-end facade idea follows below.
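
    For concreteness, a hedged sketch of the front-end facade described above: a thin WCF service that exposes DTOs and delegates to an injected implementation, with AutoMapper handling the DTO/entity conversion. All type names are hypothetical, and the static Mapper API matches the AutoMapper of that era.

        using System.ServiceModel;
        using AutoMapper;

        // hypothetical types for illustration
        public class Customer        { public int Id { get; set; } public string Name { get; set; } }
        public class CustomerDto     { public int Id { get; set; } public string Name { get; set; } }
        public interface ICustomerRepository { Customer Find(int id); }

        [ServiceContract]
        public interface ICustomerService
        {
            [OperationContract]
            CustomerDto Get(int id);
        }

        public class CustomerService : ICustomerService
        {
            private readonly ICustomerRepository _repo;   // resolved by Unity at host startup

            public CustomerService(ICustomerRepository repo) { _repo = repo; }

            public CustomerDto Get(int id)
            {
                // Mapper.CreateMap<Customer, CustomerDto>() is assumed to run once at startup
                return Mapper.Map<Customer, CustomerDto>(_repo.Find(id));
            }
        }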


  • Disk Cleanup on Server 2008 R2 is ineffective

    - by cparker4486
    I have a user profile with ~2.9 GB of error reports backed up in the ReportQueue folder (C:\Users\UserName\AppData\Local\Microsoft\Windows\WER\ReportQueue). Running Disk Cleanup as the administrator does not detect these files and therefore does not clean them up. Running the utility as the user, however, shows an even larger amount (12.4 GB!) of error reporting files. The problem is that after running the cleanup utility, the disk space used does not decrease by anywhere near 12.4 GB, and running the utility again detects the same 12.4 GB of files. What is the problem here? Alternatively, can I manually delete the files in the ReportQueue folder?
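
    On the manual-deletion question, a hedged sketch: queued error reports are generally safe to delete by hand, and Windows recreates the folder as needed. Run elevated; the profile path is taken from the question, and the %%D form is for a batch file (use %D interactively):

        cd /d "C:\Users\UserName\AppData\Local\Microsoft\Windows\WER\ReportQueue"
        for /d %%D in (*) do rd /s /q "%%D"
        del /q *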

