Search Results

Search found 27142 results on 1086 pages for 'control structure'.

Page 547 of 1086

  • Ultimate way to use Picasa in a home network

    - by luisfarzati
    I've been trying a lot of approaches but still haven't found an effective solution. I have gigs of photos on a network drive (an Iomega Home Media Network Drive, plugged into my wifi router). I'd like to do two things: 1) Run a Picasa import of all the photos on the drive, having Picasa physically organize the files into a year/month folder structure. Ideally the import target directory would be the same network drive; otherwise I'd have to move all the imported files from my local computer back to the drive myself. 2) Share the Picasa database over the network by uploading it to the network drive, then have me and other members of the family point our Picasas at the network database, see the photos, and make changes to it (tag faces, create logical albums, etc.). Is there ANY possibility to accomplish this? Or should I be looking for another photo management app, and in that case, do you know of one? Thank you!

  • Moving a cPanel backup of a Magento site to a VPS

    - by user2564024
    I had my site on shared hosting and took a full backup. Its top level contains: addons, bandwidth, counters, cp, cron, digestshadow, dnszones, domainkeys, fp, has_sslstorage, homedir, homedir_paths, httpfiles, locale, logaholic, logs, meta, mm, mma, mms, mysql, mysql.sql, mysql-timestamps, nobodyfiles, pds, proftpdpasswd, psql, quota, resellerconfig, resellerfeatures, resellerpackages, sds, sds2, shadow, shell, ssl, sslcerts, ssldomain, sslkeys, suspended, suspendinfo, userconfig, userdata, va, vad, version, vf. Now I have subscribed to a VPS and copied the files inside homedir/public_html to /var/www/html on the new host, but when I view the site in a browser I see the following error: "There has been an error processing your request. Exception printing is disabled by default for security reasons. Error log record number: 259343920016". I have just created a database named magenhto inside MySQL. Previously I had cPanel and used a one-click installer, so I am not sure how to hook the MySQL data from the backup up to this new system, or whether any more changes are needed.
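
    The usual next step is to load the dump into the new database and point Magento's configuration at it. A minimal sketch under those assumptions (credentials and the dump path are placeholders; the database name is the one from the question):

        # Create the database and load the cPanel dump into it
        mysql -u root -p -e "CREATE DATABASE magenhto"
        mysql -u root -p magenhto < /path/to/backup/mysql.sql
        # Magento 1.x reads its DB host, name and credentials from
        # app/etc/local.xml - they must match the new server
        # Magento 1.x hides the real exception behind that generic page;
        # copying this file makes it print the actual error:
        cp /var/www/html/errors/local.xml.sample /var/www/html/errors/local.xml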

  • Access to a network server without port forwarding

    - by SdevDavid
    I have a network with the following structure. The server on PC2 is a simple TCP socket server listening on port 8080. I need to access PC2 from an external network with a socket client. The socket client knows the public IP (85.xxx.xxx.x), the private IP (192.168.0.21) and the port. How can I access PC2 without port forwarding on the router? If possible, I would like a reference implementation of this case in any programming language.
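
    One common pattern when the router can't be touched is to reverse the direction of the connection: PC2 dials out to a publicly reachable host and keeps a tunnel open. A sketch using an SSH reverse tunnel (the relay host name is hypothetical):

        # Run on PC2: keep a reverse tunnel open to a publicly reachable host
        ssh -N -R 8080:localhost:8080 user@relay.example.com
        # External clients then connect to relay.example.com:8080; note the
        # relay's sshd needs GatewayPorts enabled to accept non-local clients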

  • Disaster - partitions lost, data seemingly alive, how to recover?

    - by a2h
    I've used TestDisk, and it has written back my old partition structure: a ~20GiB partition for Vista, a ~25GiB partition for 7 (which now shows up as unallocated) and a ~400GiB partition for documents. What it's meant to be is a 30GiB partition for 7, some unallocated space, and a ~400GiB partition for documents. So currently I have access to all my documents, but not to any of the programs I've installed on C:, or to AppData, because my boot partition is now supposedly a 20GB Vista partition. I've tried using my Windows 7 install disc's repair function, but that did nothing beyond wasting about 10 minutes of my time. I'm currently posting from an Ubuntu live CD. Any help?

  • Access denied to EFS encrypted files after PC joins domain

    - by mjmarsh
    I'm experiencing strange behavior with the Windows Encrypted File System: 1) I have a machine that is in workgroup mode (not joined to a domain). 2) I encrypt an entire directory structure on the machine (basically a folder and subfolders with data files for my application). 3) My application writes and reads files from the encrypted file hierarchy as a local Windows user (let's call the account 'SecureUser'). This works fine. 4) I then join the PC to a domain (let's call it 'TEST'). 5) Afterwards, processes running as the local 'SecureUser' account can't read the files it wrote originally when it was off the domain. (What is also strange is that the files are now listed as read-only, and I cannot unset this flag via Windows Explorer or the command line, even though it looks like it succeeds.) 6) I then un-join the PC from the domain and everything works again. Is there something about changing domain membership on a PC that changes the behavior of EFS so that previously encrypted files cannot be read, even by the originating user? Thanks in advance
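
    Not an explanation of the domain-join behaviour, but a precaution worth taking before experimenting further: back up the EFS certificate and private key of the account that encrypted the files, so the data stays recoverable whatever the machine's domain state. A sketch with the built-in cipher tool, run as SecureUser (file names are placeholders):

        REM Back up the EFS certificate and private key to a .pfx file
        cipher /x C:\backup\efs-secureuser
        REM Show which certificate thumbprints can decrypt a given file
        cipher /c C:\data\somefile.dat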

  • Why does IIS 7 return a 500 when I access an HTML page?

    - by Out Into Space
    IIS 7 returns a 500 server error when I request an HTML page with this structure:

        <html>
          <head>
            <title>Test Page</title>
          </head>
          <body>
            Some text
          </body>
        </html>

    It works just fine the first time I access it, but subsequent attempts cause the error. If I remove the HTML tags, the error doesn't occur:

        <body>
          Some text
        </body>

    It seems very odd that the presence of the HTML tag would cause it to blow up. Any ideas?
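
    A 500 with no detail usually means IIS is masking the real failure. A first diagnostic step is to turn on detailed error pages so the underlying module error is shown; a sketch, assuming the default site name:

        REM Show the real error instead of the generic 500 page
        %windir%\system32\inetsrv\appcmd set config "Default Web Site" /section:httpErrors /errorMode:Detailed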

  • Data inaccessible from WHS drive

    - by Eakraly
    Hi all, I had a Windows Home Server machine that crashed. Before doing anything, I decided to get the data out of it, so I took the hard drive and connected it to a Windows 7 PC. What I get is that I cannot access almost any file! I do see the directory structure, and I can open small files like .txt and .ini, but bigger files like .iso and video are no go. The same goes for Ubuntu and OS X: I can see the files and even copy them, but they come out corrupted. Any ideas what the problem is?

  • Is there a way to create a copy-on-write copy of a directory?

    - by BCS
    I'm thinking of a situation where I would have something that creates a copy of a directory, tweaks a few files, and then does some processing on the result. This would be done fairly often, maybe a few dozen times a day. (The exact use case is testing patch submissions: dupe the code, patch it, build/test/report/etc.) What I'm looking for could be done by creating a new directory structure and populating it with hard links from the original. However, this only works if all the tools you use delete and recreate files rather than edit them in place. Is there a way to have the file system do copy-on-write for a file? Note: I'm aware that many FSs use COW at a block level (all updates are done via writes to new blocks), but this is not what I want.
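
    Depending on the filesystem, there are two close fits (a sketch; paths are hypothetical). On Btrfs (and newer XFS), cp can ask the filesystem for a true file-level copy-on-write clone; on any filesystem, a hard-link farm gives exactly the behaviour described above:

        # Btrfs/newer XFS: true file-level COW clone, instant, and safe
        # against in-place edits
        cp -a --reflink=always original/ copy/
        # Any filesystem: hard-link farm - cheap, but only correct if tools
        # replace files rather than edit them in place
        cp -al original/ copy/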

  • Our company has 100,000s+ photos, how to store and browse/find these efficiently?

    - by tobefound
    We currently store our photos in a structure like this: folder\1\10000 - 19999.JPG|ORF|TIF (10,000 files), folder\2\20000 - 29999.JPG|ORF|TIF (10,000 files), etc. They are stored on four different 2TB D-Link NASes, attached and shared on our office network (\\nas1, \\nas2, and so on). Problems: 1) When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor: the list takes a long time to generate in the Explorer window, even with icons turned off. 2) Initial access to the NAS itself is sometimes slow. SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100,000+ files in one single folder shouldn't be a problem, but we don't dare go there now that we're experiencing problems at the 10K level. All input greatly appreciated, /T

  • Turning a running Linux system into a KVM instance on another machine

    - by Charles
    I have two physical machines that I wish to virtualize. I cannot (physically) plug the hard drives from either machine into the new machine that will act as their VM host, so I think that copying the entire structure of the system over using dd is out of the question. How can I best go about migrating these machines from their hardware to the KVM environment? I've set up empty, unformatted LVM logical volumes to host their filesystems, with the understanding that giving the VMs a real partition to work with achieves higher performance than sticking an image on the filesystem. Would I be better off creating new OS installs and rsyncing the differences over? FWIW, the two machines to be VM'd are running CentOS 5, and the host machine is running Ubuntu Server 10.04 for no particularly important reason. I doubt this matters too much, as it's still going to be KVM and libvirt that matter.
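
    For what it's worth, dd does not actually require moving the drives: the block copy can be streamed over the network straight into the prepared logical volume. A sketch (device, volume group and host names are placeholders):

        # On the source machine, booted from a live CD so the disk is quiescent
        dd if=/dev/sda bs=4M | ssh root@kvmhost 'dd of=/dev/vg0/centos5-a bs=4M'

    The virt-p2v tool automates much the same flow. The rsync-into-a-formatted-LV route also works, but then the bootloader and fstab have to be fixed up by hand afterwards.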

  • Regex that works in RedHat gives no results in Ubuntu

    - by Supratik
    My goal is to match specific files from specific subdirectories. I have the following folder structure:

        `-- data
            |-- a
            |-- a.txt
            |-- b
            |-- b.txt
            |-- c
            |-- c.txt
            |-- d
            |-- d.txt
            |-- e
            |-- e.txt
            |-- org-1
            |   |-- a.org
            |   |-- b.org
            |   |-- org.txt
            |   |-- user-0
            |   |   |-- a.txt
            |   |   |-- b.txt

    I am trying to list only the files directly inside the data directory. I am able to get the correct result using the following command in RHEL:

        find ./testdir/ -iwholename "*/data/[!/].txt"
        a.txt b.txt c.txt d.txt e.txt

    If I run the same command in Ubuntu, it is not working. Can anyone please tell me why it is not working in Ubuntu?
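
    For comparison, here is a version that avoids the -iwholename pattern entirely and should behave the same on both distributions (a sketch, assuming the layout above):

        # Plain files directly inside data, one level deep only
        find ./testdir/data -maxdepth 1 -type f -iname "*.txt"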

  • Creating a subdomain on Windows Server

    - by jason
    Hi, I'm trying to set up a subdomain for development on a Windows server and am having problems setting the correct details in the httpd.ini file; I hoped someone could help. I have set up the subdomain http://dev.website.com. The files that I want to use for this subdomain are on the server in a folder called development, http://www.website.com/development; in the directory structure they are in /htdocs/development. What do I need to add to the httpd.ini file to point http://dev.website.com to the files located in the /htdocs/development folder on the server? Many thanks
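
    Assuming httpd.ini here is a Helicon ISAPI_Rewrite 2 configuration (the usual reason that file exists on a Windows web server), a sketch of a host-based rule mapping the subdomain onto the folder; the exact syntax should be checked against the installed version:

        [ISAPI_Rewrite]
        # Serve requests for dev.website.com from the /development folder
        RewriteCond Host: ^dev\.website\.com
        RewriteRule (.*) /development$1 [I,L]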

  • Is there a "pattern" or a group that defines *rc files in *nix environments?

    - by Somebody still uses you MS-DOS
    I'm starting to use the command line a little more, and I see there are a lot of ways to configure some of the config files in my $HOME. This is good, since you can customize things the way you really like. Unfortunately, for beginners, having too many options is a little confusing. For example, I created .bash_alias for some aliases I'm using. I didn't even know this option existed; I'm used to simply editing .bashrc. Does a pattern, a "good practice", exist regarding the structure of rc files, envisioning flexibility and modularity? Does a standardization group exist for this, or does everybody just create their own configuration setup?
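
    There is at least one widely copied convention here: Debian's stock ~/.bashrc ends by sourcing ~/.bash_aliases if it exists, which is why creating that file (with that exact name) "just works". The same pattern generalises to splitting any rc file into modules:

        # From Debian's default ~/.bashrc: aliases live in their own file
        if [ -f ~/.bash_aliases ]; then
            . ~/.bash_aliases
        fi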

  • Everything You Ever Wanted to Know about Mod_Rewrite Rules but Were Afraid to Ask?

    - by Kyle Brandt
    How can I become an expert at writing mod_rewrite rules? What is the fundamental format and structure of mod_rewrite rules? What form/flavor of regular expressions do I need to have a solid grasp of? What are the most common mistakes/pitfalls when writing rewrite rules? What is a good method for testing and verifying mod_rewrite rules? Are there SEO or performance implications of mod_rewrite rules I should be aware of? Are there common situations where mod_rewrite might seem like the right tool for the job but isn't? What are some common examples?
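
    As a taste of the format: each rule is a Perl-compatible regular expression applied to the URL path, optionally guarded by RewriteCond tests against server variables. A canonical example (hostname hypothetical):

        RewriteEngine On
        # Canonicalise the host: send bare-domain requests to www,
        # preserving the requested path
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]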

  • Shortcut with arguments in Debian

    - by Duncan
    I have a volume on a Debian server which contains a large number of images at full resolution in various folders. What I'd like to do is have a separate browse-proxy folder which contains lower-quality browse copies of these, to let users access them for viewing over lower-speed dial-in accounts. I'd ideally like these to be created on the fly using ImageMagick, so there isn't the need to store the large number of browse copies full time and worry about keeping them up to date, etc. The way I'd envisaged this happening is the browse-proxy folder containing a duplicate file and folder structure, but with symlinks pointing to a script that transforms them, with the file path as an argument. Except I know this isn't possible with symlinks, so I am wondering if there's another way of doing this on Linux. On Windows, shortcuts can take arguments, and I'm wondering how to do the same on a Linux platform? (Or perhaps I'm going about this the wrong way?)
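
    The symlink-with-arguments idea maps naturally onto a tiny CGI script instead: the web server passes the requested path to the script, and ImageMagick streams a reduced copy without anything being stored. A minimal sketch, with all paths hypothetical:

        #!/bin/sh
        # Hypothetical CGI: a request for /cgi-bin/browse/<path> streams a
        # reduced-quality JPEG of /archive/<path> without storing anything
        echo "Content-Type: image/jpeg"
        echo
        convert "/archive${PATH_INFO}" -resize 25% -quality 60 jpeg:-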

  • What are the pros & cons of these MySQL engines for OLTP -- XtraDB, PBXT, or TokuDB?

    - by Continuation
    I'm working on a social website with an approximate read/write split of 90/10, and I'm trying to decide on a MySQL engine. The ones I'm interested in are: XtraDB, PBXT, TokuDB. What are the pros and cons of each for my use case? A few specific questions: PBXT uses a log-based structure that avoids double-writes. It sounds very elegant, but the benchmarks I've seen don't show any/much advantage over XtraDB. Do you have any experience with PBXT/XtraDB you can share? TokuDB sounds VERY interesting, but all the benchmarks I've seen are about single-threaded bulk inserts - inserting 100M rows, for example. That's not very relevant for OLTP. What about its performance with a large number of concurrent threads writing and reading at the same time? Has anyone tried that?

  • How does Linux's unlink on an NTFS filesystem differ from Windows' own implementation?

    - by DavideRossi
    I have an external USB disk with an NTFS filesystem on it. If I remove a file from Windows and run one of the several "undelete" utilities (say, TestDisk), I can easily recover the file (because it's still there but marked as deleted). If I remove the file from Linux (I'm using Ubuntu), no utility can recover the file (unless I use a deep-search, signature-based one). Why? It looks like Linux does not just "mark it as deleted" but wipes away some on-disk structure. Is this the case?

  • Sharing a symlinked (`mklink /d`) directory via SMB?

    - by Alois Mahdal
    I have a Windows 7 amd64 box where one directory is shared: the local path is d:\drop\, the remote path is \\aloism\drop. From the SMB point of view, Everyone has Read and Write permission. ACLs for the folder are set so that all authenticated users have read and write permissions: NT AUTHORITY\Authenticated Users:(OI)(CI)C (which is inherited to all levels below). Now I create a symbolic link within the structure of the directory:

        D:\drop>mklink /d tools2 tools
        symbolic link created for tools2 <<===>> tools

    The problem is that I can't access this new directory from any of the remote machines (a Windows 7 box and a Windows XP box - both behave the same way):

        C:\>dir \\aloism\drop\tools2\
        Volume in drive \\aloism\drop is droot
        Volume Serial Number is FA73-1897

        Directory of \\aloism\drop\tools2

        File Not Found

    How can I make it work? Possibly also for files?
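
    Whether SMB clients may follow symlinks on the far side is governed by a client-side policy, which is worth checking before anything else. A sketch (run in an elevated prompt on the Windows 7 machine doing the browsing; the XP box predates NTFS symlinks, so it may never cooperate):

        REM Show the current symlink evaluation policy
        fsutil behavior query SymlinkEvaluation
        REM Allow remote-to-remote and remote-to-local link following
        fsutil behavior set SymlinkEvaluation R2R:1 R2L:1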

  • Unix/Linux permissions setup for shared hosting with Apache

    - by weiyin
    I'm in the process of setting up a server from a clean CentOS 5 install. What is the best permission structure (users, groups, Unix permissions) for running a single instance of Apache for multiple users? Ideally, it should satisfy these requirements: each user's websites are stored in a subdirectory of their home directory; users can edit files and permissions; Apache can read the websites of all users; no user can read the website files of other users. Bonus question: how do you add PHP and/or Perl and/or Ruby to Apache without allowing any user to access other users' files?
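
    A minimal sketch of one common layout under those requirements (user and path names are hypothetical; Apache on CentOS runs as the apache user/group): each user owns their tree, the apache group gets read-only access, and "other" gets nothing.

        # Apache may traverse (not list) the home dir; other users get nothing
        useradd alice
        chgrp apache /home/alice
        chmod 710 /home/alice
        # Web tree: alice owns and writes, the apache group reads; setgid
        # keeps new files in the apache group
        mkdir -p /home/alice/public_html
        chown alice:apache /home/alice/public_html
        chmod 2750 /home/alice/public_html

    For the bonus question, file modes alone aren't enough: with mod_php or mod_perl every site runs as the same apache user, so real isolation needs suexec/suPHP or per-user FastCGI processes.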

  • IIS6 host multiple websites under same sub-domain (or something similar)

    - by Sigurbjörn
    Hi there! I'm trying to figure out a structure for a hosted application that I'm working on. I've got a domain, let's call it app.company.com (a sub-domain of company.com, of course), that is set up to redirect to my IIS 6 web server. I would like to set up one website in IIS for each client that will use this application, and have the URL scheme be like this: app.company.com/clientA would point to the ClientA website in IIS; app.company.com/clientB would point to the ClientB website in IIS. Do you guys have any pointers or best practices for my scenario?
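
    One caveat worth knowing: in IIS 6, one host name can't fan out to several sites by URL path, since sites are distinguished by host header, IP, or port. The usual shape is a single app.company.com site with one virtual directory (or application) per client. A sketch with the stock IIS 6 admin script (site and path names hypothetical):

        cscript %SystemRoot%\system32\iisvdir.vbs /create "App Site" clientA D:\hosting\clientA
        cscript %SystemRoot%\system32\iisvdir.vbs /create "App Site" clientB D:\hosting\clientB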

  • IMAP subfolders not shown by the LIST command - perhaps an ACL misconfiguration

    - by mschenk74
    My goal is to copy the whole folder structure, with all mails, from one IMAP account to another. The tool I am using for this is imapcopy (the Java-based version from code.google.com, since the Unix/Linux tool packaged with Debian doesn't support IMAPS). Now, there is one problem: the tool only copies the top-level folders and not the nested ones. To narrow down the problem I downloaded the source code of imapcopy and debugged into it. There I noticed that folder.list() (which is mapped to the LIST "%" IMAP command) returns an empty list, but when I do a getFolder(<subfoldername>) I can access those subfolders. After reading some documentation about the features of IMAP, I think the problem might be misconfigured ACLs which prohibit listing those folders but allow reading and writing to them. How should I check these ACLs? Which tools do I need for this task?
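
    Before blaming ACLs, it's worth ruling out wildcard semantics: in the IMAP protocol, "%" matches only a single hierarchy level while "*" recurses, so a LIST built on "%" will never return nested folders in one call. Both can be checked by hand in a raw session (the folder name is hypothetical; the # lines are annotations, not protocol):

        a1 LOGIN user password
        # "*" recurses through the whole hierarchy
        a2 LIST "" "*"
        # "%" stops at one hierarchy level, by design
        a3 LIST "" "%"
        # ACL check, if the server advertises the RFC 4314 ACL extension
        a4 GETACL INBOX.Archive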

  • Data storage solutions for rapidly running out of space

    - by Grimlockz
    I have two web servers (one live, the other a backup), and the issue I have is that our storage is rapidly running out. All the data on the server is used by our customers, and new documents are uploaded to the server daily, so nothing can be deleted as it's always in use. We use a flat file structure with no database. I'm seeking solutions or ideas for the best place to move our data to. The data has to be secure and needs to run in a Linux environment. Not sure where to start - clusters, VMware, or other such solutions for huge file servers?

  • Amazon S3 - Storage Class and Server Side Encryption

    - by Steven
    Ahhh! I am using Amazon S3 for some low-price storage to clear down our SAN. I created the bucket and created a root folder. I set the storage class to Standard and server-side encryption to AES. I started a copy job to move the files; some files copied over, and I checked them: storage class Reduced Redundancy, encryption set to None. WTF? So I deleted all the files and folders, manually recreated the folder structure, and again set the storage class and encryption level. I copied some files and bam, still showing (at the file level) as Reduced and not encrypted. So my question is this: (a) is it really raid'd and encrypted and just not showing it properly (the root folder shows it, so how can the files not?), or (b) am I being a huge tool and missing something?
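
    Part of the answer is that storage class and (absent a bucket default-encryption policy) server-side encryption are not folder settings at all: they are attributes stamped on each object at upload time, so a copy job that doesn't request them produces objects without them, whatever the "folder" shows. A sketch of setting both per object with the AWS CLI (bucket name hypothetical):

        # Set both attributes explicitly on upload
        aws s3 cp ./file.bin s3://my-bucket/archive/file.bin \
            --storage-class STANDARD --sse AES256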
