Search Results

Search found 21923 results on 877 pages for 'software companies'.

Page 410/877 | < Previous Page | 406 407 408 409 410 411 412 413 414 415 416 417  | Next Page >

  • Client can't reach my production webserver. It's their ISP's fault, but now what?

    - by MikeN
    I have a customer in Michigan who can't access my production SaaS web server, which is hosted on Slicehost. All other companies across the US/Canada/Europe have no problem reaching the site. The problem occurs intermittently, and Slicehost customer service says it's a problem with the client's ISP. I got my client's IP address; pinging that IP address from my PROD server fails, but pinging it from my dev box or from our separate blog server (also hosted on Slicehost) works. How do I debug a problem like this? I asked the client to reach out to their local ISP and ask about the problem. A traceroute shows that the packets are getting dropped at a Comcast Michigan node, which is the client's ISP. Is there anything more I can do to fix this problem for my client?
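    One practical next step (a sketch below, assuming mtr is installed on both slices and using 203.0.113.45 as a stand-in for the client's real IP) is to record the forward path from both the production server and the working blog server and compare where they diverge; that comparison, together with the client's own traceroute back to the server, is the kind of evidence Comcast will want attached to a ticket.

      # run on BOTH the production slice and the blog slice, then compare the output
      mtr --report --report-wide --report-cycles 50 203.0.113.45 > path-from-$(hostname).txt

      # plain traceroute works too if mtr is not available
      traceroute -n 203.0.113.45

    If the path from the production slice consistently dies at the same Comcast hop while the path from the blog slice gets through, that per-source difference points at a routing or filtering issue on the ISP side rather than anything on the server.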

    Read the article

  • Generic/Text Printer on Windows 7 not prompting for file name

    - by Trevor Tippins
    Hope someone can shed some light on this. I am downloading reports from an AIX-based system by directing them to a TT printer which the terminal emulator (MultiView 2000) intercepts and directs to the default printer on the local system. This local printer is configured as a vanilla Generic/Text printer attached to a FILE port. When I print from AIX, the output is spooled down and the local printer prompts for a file name into which to save the file...but not under Windows 7. This has worked fine for many years, on both Win2K and WinXP. However, on Windows 7 the output gets spooled as a file into spool\PRINTERS (and looks as expected) but the print job then hangs with a status of "Error - Printing" and never prompts for a file name. I have to cancel the job. The Generic/Text printer works as expected with other applications. I have tried setting the printer to print directly rather than spooling but this only serves to hang the terminal session too. I've also tried to run the emulator in Windows 2000 Compatibility Mode and as Administrator in case it was something like that but with no luck. As you might expect, it does work fine in XP Mode (as long as I print to a printer defined therein and not the host's printer) but operationally this isn't going to be an option. Obviously this emulation software is a decade old (at least) and I could just cross/upgrade all the users (at a cost) but, before I do so, has anyone seen this sort of behaviour before and found some sort of fix? Remote OS: AIX 5 Client OS: Windows 7 Pro (32-bit) Printer: Generic/Text on a FILE port TE Software: MultiView 2000 (32-bit) Thanks in advance.

    Read the article

  • TCP/IP performance tuning under KVM/Qemu

    - by vpetersson
    With more and more companies switching to public cloud services, I'm curious what your thoughts are on TCP/IP tuning in the cloud. Is it worth bothering with? Given that you don't have access to the host server, you're somewhat limited, I presume. Let's say, for the sake of argument, that you're running three MongoDB servers in a replica set on FreeBSD or Linux that all sync over an internal network. I'd also be curious whether anyone has made any actual performance benchmarks to back up their arguments. I benchmarked the various network drivers available for KVM/Qemu here, but I'm curious what the gurus here suggest to tune further. I started playing around a bit with the tuning recommendations as suggested over here, but interestingly enough I saw a decrease in performance rather than an increase; perhaps I didn't fully understand the tweaks. Update: I did a few more benchmarks and posted the result here. Unfortunately the result wasn't really what I expected.
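    For reference, the guest-side tuning most of those guides talk about boils down to a handful of sysctls; the sketch below shows the Linux versions with example values only (they are assumptions to benchmark against, not recommendations, and on FreeBSD the equivalent knobs live under net.inet.tcp).

      # /etc/sysctl.d/10-tcp-tuning.conf -- example values, measure before and after
      net.core.rmem_max = 16777216
      net.core.wmem_max = 16777216
      net.ipv4.tcp_rmem = 4096 87380 16777216
      net.ipv4.tcp_wmem = 4096 65536 16777216

      # apply without rebooting
      sysctl -p /etc/sysctl.d/10-tcp-tuning.conf

    Since the bottleneck in a virtualized stack is often the emulated NIC rather than the TCP buffers, it is worth comparing virtio against the emulated e1000 driver before and after any sysctl changes, so the two effects are not conflated.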

    Read the article

  • Turn computer into DAS (Direct Attached Storage)

    - by Damon
    Can we build direct attached storage by taking a computer/server, adding an HBA, and installing some appropriate software? We would use Debian as the host OS for both the DAS and the server. If so, what software do we use? And do we simply need an HBA for the DAS and the server, or do we need more hardware? The goal is to use an older server that does not have enough room for drives but does have ECC memory, server processors, redundant power supplies, dual NICs, etc. Then find any boxes, server or not, the key being enough room for 8-12 drives, fans, etc., and turn them into DAS units; build two of these and have them connected to the server. Eventually we want to have two servers using DRBD and associated services like Heartbeat and Pacemaker to create an HA setup for our server(s), but that will take a long time to configure since I have no experience with anything related to DRBD (yet) and have a learning curve to get past, not to mention the additional cost of more hardware (two servers vs. one).

    Read the article

  • TV network capabilities.

    - by Narcolapser
    Question: Can the major TV manufacturers' internet TV systems access network resources? Info: I'm looking at large TVs right now for my company to set up in conference rooms. We want the ability to load presentations without having to have a computer to do so. Our hope is to put things onto network drives and access and display them from there. I've heard that LG TVs can do this if you convert the PowerPoint file into a show format; that's fine. I just need to get this information to the TV without a computer attached. Can anyone tell me if companies like LG, Vizio, Sony, Samsung, etc. have TVs that are capable of doing this? Thanks ~n

    Read the article

  • Is Ubuntu a viable replacement of Windows XP for small enterprise environments?

    - by Alex. S.
    Hi all, I'm a newbie systems administrator, so any advice would be great. I would like to set up Ubuntu 8.04 LTS in a small management consulting office (around 50 workstations) instead of Windows XP. I would install MS Office 2007 via WINE (*). It would be a fresh installation, so the migration would be less of a pain. The new setup would also include a small server as a document repository and a backup server for now. Later, I would install other goodies like an IM server, a document management solution, and whatever other collaborative tools we need. What do you advise in this scenario? Do you think it is viable? Should I try to convince my managers this is a good idea? I consider myself a fairly experienced user of both systems, and I'm the only guy in charge of everything. I need to cut costs, and I think that antivirus and antimalware software are a waste of money and time. Is this a good idea, or should I give in, lock down the Windows systems and install AV software? Is there anything else in this setup I'm not foreseeing? (*) The only catch on my test machine so far has been that Office SmartArt doesn't work properly; the rest of Office 2007 seems OK.

    Read the article

  • How to enable SMTPS (465) in Postfix on CentOS

    - by user197284
    I need help enabling SMTPS. I use Postfix and Dovecot with MySQL (virtual domains). I do not know how to enable SMTPS (port 465). I have already added the TLS-related settings, key and certificate in /etc/postfix/main.cf. OS: CentOS 6.4, 64-bit. I have amavis enabled. My /etc/postfix/master.cf is below. Please help me enable SMTPS.

      # ==========================================================================
      # service   type  private unpriv  chroot  wakeup  maxproc command + args
      #                 (yes)   (yes)   (yes)   (never) (100)
      # ==========================================================================
      smtp        inet  n       -       n       -       -       smtpd
        -o content_filter=smtp-amavis:127.0.0.1:10024
        -o receive_override_options=no_address_mappings
      pickup      fifo  n       -       n       60      1       pickup
        -o content_filter=
        -o receive_override_options=no_header_body_checks
      cleanup     unix  n       -       n       -       0       cleanup
      qmgr        fifo  n       -       n       300     1       qmgr
      #qmgr       fifo  n       -       n       300     1       oqmgr
      tlsmgr      unix  -       -       n       1000?   1       tlsmgr
      rewrite     unix  -       -       n       -       -       trivial-rewrite
      bounce      unix  -       -       n       -       0       bounce
      defer       unix  -       -       n       -       0       bounce
      trace       unix  -       -       n       -       0       bounce
      verify      unix  -       -       n       -       1       verify
      flush       unix  n       -       n       1000?   0       flush
      proxymap    unix  -       -       n       -       -       proxymap
      smtp        unix  -       -       n       -       -       smtp
      # When relaying mail as backup MX, disable fallback_relay to avoid MX loops
      relay       unix  -       -       n       -       -       smtp
        -o fallback_relay=
      # -o smtp_helo_timeout=5 -o smtp_connect_timeout=5
      showq       unix  n       -       n       -       -       showq
      error       unix  -       -       n       -       -       error
      discard     unix  -       -       n       -       -       discard
      local       unix  -       n       n       -       -       local
      virtual     unix  -       n       n       -       -       virtual
      lmtp        unix  -       -       n       -       -       lmtp
      anvil       unix  -       -       n       -       1       anvil
      scache      unix  -       -       n       -       1       scache
      #
      # ====================================================================
      # Interfaces to non-Postfix software. Be sure to examine the manual
      # pages of the non-Postfix software to find out what options it wants.
      # ====================================================================
      maildrop    unix  -       n       n       -       -       pipe
        flags=DRhu user=vmail argv=/usr/local/bin/maildrop -d ${recipient}
      uucp        unix  -       n       n       -       -       pipe
        flags=Fqhu user=uucp argv=uux -r -n -z -a$sender - $nexthop!rmail ($recipient)
      ifmail      unix  -       n       n       -       -       pipe
        flags=F user=ftn argv=/usr/lib/ifmail/ifmail -r $nexthop ($recipient)
      bsmtp       unix  -       n       n       -       -       pipe
        flags=Fq. user=foo argv=/usr/local/sbin/bsmtp -f $sender $nexthop $recipient
      #
      # spam/virus section
      #
      smtp-amavis unix  -       -       y       -       2       smtp
        -o smtp_data_done_timeout=1200
        -o disable_dns_lookups=yes
        -o smtp_send_xforward_command=yes
      127.0.0.1:10025 inet n    -       y       -       -       smtpd
        -o content_filter=
        -o smtpd_helo_restrictions=
        -o smtpd_sender_restrictions=
        -o smtpd_recipient_restrictions=permit_mynetworks,reject
        -o mynetworks=127.0.0.0/8
        -o smtpd_error_sleep_time=0
        -o smtpd_soft_error_limit=1001
        -o smtpd_hard_error_limit=1000
        -o receive_override_options=no_header_body_checks
        -o smtpd_bind_address=127.0.0.1
        -o smtpd_helo_required=no
        -o smtpd_client_restrictions=
        -o smtpd_restriction_classes=
        -o disable_vrfy_command=no
        -o strict_rfc821_envelopes=yes
      #
      # Dovecot LDA
      dovecot     unix  -       n       n       -       -       pipe
        flags=DRhu user=vmail:mail argv=/usr/libexec/dovecot/deliver -d ${recipient}
      #
      # Vacation mail
      vacation    unix  -       n       n       -       -       pipe
        flags=Rq user=vacation argv=/var/spool/vacation/vacation.pl -f ${sender} -- ${recipient}
      retry       unix  -       -       n       -       -       error
      proxywrite  unix  -       -       n       -       1       proxymap
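    What's missing above is an smtps service entry, which is what actually listens on port 465. A minimal sketch of the usual addition is below; the option names are standard Postfix, but it assumes the TLS certificate settings already present in main.cf, and whether SASL should be required on this listener depends on the existing Dovecot/MySQL authentication setup.

      # /etc/postfix/master.cf -- add (or uncomment) an SMTPS listener on port 465
      smtps       inet  n       -       n       -       -       smtpd
        -o smtpd_tls_wrapmode=yes
        -o smtpd_sasl_auth_enable=yes
        -o smtpd_client_restrictions=permit_sasl_authenticated,reject

    After reloading Postfix (service postfix reload), check that something is listening on 465 with netstat -ltnp, make sure 465/tcp is allowed through iptables, and test the TLS handshake from outside with openssl s_client -connect yourhost:465.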

    Read the article

  • domain user disabling screensaver

    - by RASG
    I have the following situation: Due to security reasons, the screensaver is activated after 10 minutes and immediately locks the screen. There are GPOs preventing the user from changing the screensaver parameters and the background image. In order to bypass the background policy, some users are using bginfo. The problem is that, for some reason, the screensaver now no longer works. The settings are still the same (10 minutes; locked to the user), and comparing snapshots of the registry before and after executing bginfo doesn't show any significant modification. Any hints? EDIT 1: OK, I figured out what's going on, but now I have another question. bginfo refreshes the user settings by reading HKEY_CURRENT_USER\Control Panel\Desktop, which has ScreenSaveActive. If the user sets it to 0, it disables the screensaver. Why isn't HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Control Panel\Desktop, which sets ScreenSaveActive to 1, being enforced? Or, if it is being enforced, where is bginfo storing the value 0, and how can it bypass the policy? EDIT 2: I also discovered that after setting any value in HKEY_CURRENT_USER\Control Panel\Desktop\ScreenSaveActive, it can be deleted and the last value will remain active. For some reason the HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Control Panel\Desktop\ScreenSaveActive value is not being enforced for the user.
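    A quick way to see which value is actually winning on an affected machine is to read both locations as the logged-on user and then force a policy refresh; a small sketch (Windows command prompt, nothing here is specific to bginfo):

      :: per-user value that bginfo (or the user) can touch
      reg query "HKCU\Control Panel\Desktop" /v ScreenSaveActive

      :: policy value written by the GPO; policy keys normally take precedence
      reg query "HKCU\Software\Policies\Microsoft\Windows\Control Panel\Desktop" /v ScreenSaveActive

      :: refresh group policy and check again
      gpupdate /force
      gpresult /r

    If the policy value is present but the screensaver still follows the per-user key, the GPO itself (scope, filtering, loopback processing) is worth ruling out before blaming bginfo.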

    Read the article

  • Copying a Windows Home Server backup offsite

    - by Simon
    What ways are there to copy a Windows Home Server backup to an offsite location? I'm talking specifically (and only) about the automated backup of my entire machine, not the shared network folders. I work away from home 90% of the time on my laptop, which has a 640GB drive, so the shared folders are essentially useless to me. I back up every night, but if my house burns down or is broken into, I'm in serious trouble! I'm really looking for some alternative way to back up my entire machine, which must not interfere with the reliability or speed with which my WHS backs up my laptop every night. I'm looking for either a way to 'export' a complete machine backup from the server, or recommendations for non-conflicting software I can use to back up to a 1TB drive at work. Note: I believe that WHS uses its own completely proprietary backup and doesn't use things like a 'backup bit' or 'archive bit'. I just don't want to install some other backup software that will conflict. PS: I'm now running Windows 7 and just realized that I should probably check out the backup functionality it gives me. I assume that won't conflict, right? Edit: Thanks for the hosted solutions. I'd also appreciate ways to back up to an 'offsite' location that I control, like my office vs. my home. The hosted solutions, I think, will be too slow or expensive for my needs.

    Read the article

  • How do I configure VMware View location-based printing to use Active Directory Groups?

    - by Jason Pearce
    I am attempting to configure VMware View 4.5's Location-Based Printing, which leverages an included OEM version of ThinPrint, to assign printers to Active Directory groups. The location-based printing feature maps printers that are physically near client systems to VMware View desktops. I am using the Active Directory group policy setting AutoConnect Location-based Printing for VMware View, which is located in the Microsoft Group Policy Object Editor in the Software Settings folder under Computer Configuration. The AutoConnect Location-based Printing for VMware View setting appears to be just a name translation table. It permits me to assign a specific printer or printers to an IP Range, Client Name, Mac Address, User, or User Group. I'm attempting to assign printers to Active Directory user groups. I have created a new Active Directory group for each printer that I intend to use in VMware View desktop pools. I will then assign Active Directory users to the Active Directory groups that represent each network printer. Example: doej is a member of the PTR-FLOOR2-NORTH-ROOM255 Active Directory group. Using AutoConnect, I assigned the group to receive a network printer by adding PTR-FLOOR2-NORTH-ROOM255 in the User/Group column. Problem: When doej logs in to his VDI session, the printer is not present. However, if I use a wildcard "*" in the User/Group column instead of the specific PTR-FLOOR2-NORTH-ROOM255 Active Directory group, the printer is present and functions as designed. Alternatives: I have tried assigning printers to Active Directory groups within AutoConnect in the following ways, all unsuccessful: PTR-FLOOR2-NORTH-ROOM255 domainexample\PTR-FLOOR2-NORTH-ROOM255 domainexample.local\PTR-FLOOR2-NORTH-ROOM255 Confirmation: The information used to map the printer to the VMware View desktop is stored in a registry entry on the View desktop in HKEY_LOCAL_MACHINE\SOFTWARE\Policies\thinprint\tpautoconnect. For each of these examples, I have reviewed the registry entry and can confirm that the desktop is receiving the information from the AutoConnect translation table. Summary: Can anyone provide an example of how to configure VMware View 4.5's Location-Based Printing so that I may assign network printers to Active Directory groups via the included AutoConnect tool? I would welcome a clear example of a working configuration. Thank you.

    Read the article

  • Best way to build / implement a corporate developer Linux distro with multiple kernels?

    - by Garen
    At work we have Linux users who understandably prefer using Ubuntu. The problem is, we also have developer tools that only work with 'officially' supported Linux distributions that use much older 2.6.18-based kernels. (And even if they worked with newer ones, the vendors could always say they won't "support" the software unless it's on one of their 'officially' supported platforms.) We could of course just tell them to use CentOS or something else 2.6.18-based, and I'm sure their response would be something like: "you can take Ubuntu from our cold, dead hands." :) Which brings me to some questions: is there any good/easy/recommended way to run something like Ubuntu as a host and CentOS 5.x as a guest OS (and with which system: Xen, KVM, VMware, ...?), and then roll that into our own custom internal distribution that could be easily installed? KVM looks like a good high-performance option just recently included in RHEL 5.4, but if hardware support for virtualization like Intel VT or AMD-V is necessary, then I'd guess only those folks with fairly new PCs will be able to do it. I would be very interested to hear how anyone else has addressed this kind of issue. EDIT: The target audience / users of this kind of system would be developers; each one needs to run locally licensed commercial software, so building out some separate beefy central machines isn't an option, unfortunately, due to license restrictions. Even if that weren't the case, a couple of developers could quickly eat up the resources with parallel builds. :) Ideally, I was hoping there was some step-by-step guide out there for building your own pre-built distribution that had e.g. CentOS 5.x and Ubuntu Desktop as a guest.

    Read the article

  • Things to check for an internet-facing email server.

    - by Shtééf
    I'm faced with the task of setting up a public-internet-facing email server that will be relaying mail for all of our other servers in the network. While the software itself is set up in a few keystrokes, what little experience I have with managing an email server has taught me that there are tons of awkward filtering techniques employed by other email systems -- systems that my own server will inevitably interact with at some point. Hence, my questions: What things should be kept in mind and double-checked when setting up an email server? What resources are available for checking whether my email server is set up correctly? I'm specifically NOT looking for instructions for any given mail server, such as Exchange or Postfix. But it's okay to say: "you should have X and Y in your setup, because when talking to server software Z, it typically tries to weed out open relays by checking for these." Some things I've discovered myself: Make sure forward and reverse DNS are set up. Mail servers tend to do a reverse lookup for the peer IP address when receiving. Matching a reverse lookup with a follow-up forward lookup is probably employed to weed out open relays run through malware on home networks. Make sure the user in the From address exists. The From address is easily spoofed. A receiving mail server may try to contact the mail server in the From domain and see if the From user actually exists.
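    Some of those checks can be scripted before the first real message goes out. The sketch below covers only the DNS side, with mail.example.com and 203.0.113.10 standing in for the server's real name and address:

      # forward and reverse DNS should point at each other
      dig +short A mail.example.com
      dig +short -x 203.0.113.10

      # MX and SPF/TXT records for the sending domain
      dig +short MX example.com
      dig +short TXT example.com

      # from a host OUTSIDE your network, relaying for a foreign domain should be refused;
      # a tool like swaks makes that easy to test:
      # swaks --server mail.example.com --from someone@example.org --to victim@elsewhere.net

    Beyond DNS, the usual list includes publishing SPF, signing with DKIM, keeping the server's IP off public blacklists, and making sure the HELO name matches the reverse DNS entry.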

    Read the article

  • ConfigMgr 2012 - How to automatically make updates available to computers without forcing them to be installed?

    - by Massimo
    I'm using System Center Configuration Manager 2012 with the Software Update Point feature; however, in this environment patching has to be strictly manual, because server reboots need to be approved and scheduled by different people; thus, I need to use ConfigMgr's SUP like I would use a plain WSUS server with auto-approval but with manual installation. I created some Automatic Deployment Rules to automatically download and deploy critical updates, with an installation deadline of "as soon as possible"; but then I also configured those rules to not do anything when the deadline is reached, and to not perform system restarts even if needed (see image). Also, I've configured the device collection those rules deploy updates to so that it doesn't have any valid maintenance window. However, I'm experiencing quite the opposite of what I was expecting: as soon as the new updates are processed by the ADRs, they get automatically installed on all systems by the Software Center, and the computers are subsequently restarted. Why is this happening? Am I getting something wrong, or is ConfigMgr 2012 just not behaving like it should?

    Read the article

  • Syncing Google Desktop Scratch Pad

    - by Anders Frey
    I'm a long-time user of Google Desktop's Scratch Pad and I would like to put the note in the cloud and make it accessible from all my devices. I'm working towards changing the file path Scratch Pad uses to retrieve the .txt so that it points to a Dropbox folder. As Google Desktop is discontinued, I've had no luck in retrieving the API, but what I've got so far is this: The Scratch Pad data is located at C:\Users\[user]\AppData\Local\Google\Google Desktop\a3d83d5fa2e9\scratchpad.txt. The registry keys related to Google Desktop are located at HKEY_CURRENT_USER\Software\Google\Google Desktop. I'm guessing the Scratch Pad app itself is registered under HKEY_CURRENT_USER\Software\Google\Google Desktop\Components. I have limited experience with the registry, so I'm not able to translate the binary and hexadecimal values, but I'm hoping that the path location is in there somewhere. I've tried a bunch of other note apps (including the 'new' scratch pad in Chrome) but haven't been able to find one that suits my needs as well as Desktop Scratch Pad does. Hence the effort in this matter. I may be way off and I'm not sure if this is possible to do, but I'm looking forward to hearing your thoughts.

    Read the article

  • HTTP cache for my virtual machines

    - by MathematicalOrchid
    I have several Linux virtual machines running on my home PC. One of the quirks of Linux is that every time you run a package manager, it wants to "refresh" the configured software repositories - which basically means it wants to download a file from the Internet. If I revert to an earlier snapshot of the VM, then next time I run the package manager it will re-download the exact same data again [since it no longer exists in the VM]. It seems a shame to waste bandwidth endlessly downloading the same data over and over again, so I was wondering if there's some way I can set up some kind of HTTP proxy server that caches downloaded files. I have no idea how you would do such a thing though. In particular, it needs to be set up so that the VMs don't need to "know" that the cache is there; it needs to be transparent. But I don't know how to do that. Any suggestions on what software I'd need to use? It would be nice if I could run it under the Windows host OS, but running a small VM with a Linux guest is also possible...
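    For what it's worth, a caching HTTP proxy such as Squid on the host (or in a small always-on guest) is one common way to do this; package downloads over plain HTTP cache very well. A minimal sketch of the relevant squid.conf directives is below, with the cache size and retention as placeholder values; the guests then only need their package manager pointed at the proxy, which is not fully transparent but close to it.

      # squid.conf -- minimal package-caching setup (example values only)
      http_port 3128
      cache_dir ufs /var/spool/squid 20000 16 256
      maximum_object_size 512 MB
      refresh_pattern -i \.(deb|rpm)$ 129600 100% 129600

      # on a Debian/Ubuntu guest, point apt at the proxy (192.168.1.10 is a placeholder):
      # echo 'Acquire::http::Proxy "http://192.168.1.10:3128";' > /etc/apt/apt.conf.d/01proxy

    Repository metadata will still be re-fetched after a snapshot revert, but the bulk of the bandwidth (the packages themselves) then comes out of the cache instead of the internet.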

    Read the article

  • Photoshop CS5 performance over network drive (cifs)

    - by grub
    Hello everyone. I installed a QNAP TS-410 NAS for a customer (a professional photographer) with 3 Hitachi Deskstar 7200 rpm 2TB disks configured as RAID 5. The NAS and the workstations are connected over a gigabit network. He and his co-worker access the photos (about 1TB of them) over a mapped network drive from their Windows machines (Windows XP, 32-bit, and Windows 7 Ultimate, 32-bit). Both use Photoshop CS5 to edit the photos. The problem is that saving an edited photo takes a really long time; it takes about 3 times as long to save a photo as to open it. After some tests I can exclude the network, the NAS and the Windows machines as the source of the issue. I think the problem is the Photoshop software and its handling of network drives. Officially, network drives are not supported by Adobe. I do not have any experience with Adobe products, especially Adobe Photoshop CS5. What are your recommendations to solve the performance issue? Should my customer copy the photos to the local drive, edit them and upload them again to the network drive, or is Adobe Drive or Adobe Version Cue the answer? One requirement is that the photos need to be accessible / editable from both computers even when one of them is offline. Adobe Version Cue needs a dedicated service running to be usable, so this solution is not possible as far as I understand the Cue software. Thank you for your input on this issue and have a nice day :-) Greetings, grub

    Read the article

  • Can I make a computer connecting via VPN visible to computers within the network it is connecting to?

    - by SCdF
    OK, here's the deal: I have a computer (specifically, a MacBook Pro) that is connected to a standard network that is then connected to the big nasty internet. Let's call it foo. It runs a web server on 8084, so if you were on its local network you could get to this with http://foo:8084/, or http://192.168.1.2:8084/, or whatever. From foo I can VPN into my company's intranet and see a computer on the local company network called bar (another MacBook Pro, incidentally). Is there any way to set this up so that while foo is on the VPN, bar can access http://foo:8084/ (or http://x.x.x.x:8084/, or whatever)? (From my limited understanding of how VPNs work, I have a sneaking suspicion the answer is no, but it doesn't hurt to ask...)

    Read the article

  • Windows 7 64-bit Google Chrome Makes Desktop Unresponsive

    - by Meengla
    For the past 1-2 weeks my desktop has been becoming unresponsive at times. The problem seems to happen as I load a page in Google Chrome; it could be any page, though today it happened when I tried to load a page with Flash content in it. The problem is not consistent and there is no pattern. I can do a virus scan using some software (no antivirus installed -- never needed that), but I strongly suspect it is some software I installed or some Windows update. This is a very powerful Dell OptiPlex 990 with 16GB RAM, so memory shouldn't be an issue. When the problem happens, the cursor turns into the spinning wait cursor even over the taskbar, and Ctrl+Alt+Del takes a long time to respond. Eventually I get a message that 'this program is not responding' with End/Close and Cancel options, but repeatedly clicking End or Cancel does nothing. Then the Ctrl+Alt+Del menu comes up. The rest of the applications keep running fine, though I can't get to them because the cursor is stuck on the wait cursor. What is happening? How can I find out what exactly caused the problem? Thanks.

    Read the article

  • How to install a "virtual" network card on a virtual server?

    - by vikp
    Hi, we have purchased an unmanaged Windows VPS hosting solution from one of the UK-based companies. We have Windows Server 2008 Standard Edition. We need to install certain third-party applications on that server. Unfortunately, one of the applications requires a MAC address to be present at all times; this is their way of making sure the software is not pirated (which it isn't). We tried installing a virtual loopback network card, but this brought the server down, i.e. we couldn't connect using Remote Desktop any longer. At the moment we are limited in what we can try. This is an unmanaged solution, so any support, including restarts, is rather costly. Are you aware of any low-risk solutions? Thank you

    Read the article

  • How to backup a large FreeNAS?

    - by Ze'ev
    We have a 12TB FreeNAS box in the office and are looking for a way to keep a backup of it offsite. We're considering (1) tape; (2) a bunch of bare drives (popped into a spare hot-swap bay); (3) external drives. Any advice on which solution is best? (Online backup is not an option because our internet connection is too slow.) Also, is there some software that will keep track of which files have been backed up and which haven't, so that when one backup unit fills up we can continue the backup on the next? (We don't want to have to back up to a single 12TB device.) This software could run, preferably, on the NAS itself, or from one of our Mac clients. Our goal is a setup where we attach some backup device; it automatically fills up with stuff from the server; the contents of that unit are catalogued somewhere; something prompts us to swap in a fresh drive/tape; and the backup continues until it is complete, including any files that have changed since they were last backed up.

    Read the article

  • Juniper NetScreen NS-5GT traffic monitoring

    - by blah
    I've done casual research into the subject and am truly dismayed at the lack of compatible tools for such a simple task. Maybe someone can provide assistance. We have a NetScreen NS-5GT in the office. I need to be able to see, at a glance, current traffic per endpoint -- I think the equivalent of 'get sessions' with byte counts/rates. I don't care about bars, graphs, and reports. Something as simple as a classic software firewall display would be perfect. I can't shell out money on something real like SolarWinds products, so a free solution is essential. I'm willing to do a little work but refuse to program something from scratch. It's not prudent right now for me to install a hub or otherwise mess around physically. There must be something out there I can use, maybe in combination. I don't believe I'm asking too much. Specific answers only please, e.g. monitoring software you know will actually work with this antiquated device. I've read about general approaches to the broader problem dozens of times already.

    Read the article

  • While using an NTFS SMB share for Mac users, do symbolic links and extended attributes work?

    - by scape
    The majority of our users are on Macs, but we'd rather support their file sharing using a Windows server with an NTFS drive, or at least a Linux server with ext3. We've had much trouble with the OS X Server software over the years and are now looking to abandon it. What's mostly holding us back is the fact that the Mac users very often rely on symbolic links and other special features that exist on an HFS+ partition. The shared locations are mostly primary storage, not just archive storage. While there is an option to create symbolic links under NTFS, I'm curious whether there is anything I need to look out for if I were to move the files from the HFS+ partition over to a new partition hosted on a Windows server, and also how well creating a symbolic link from a Mac would work. I am also worried about whether Windows backup software will ruin these special symlinks, and how placing permissions on sub-folders will work. Alternatively, I could remotely back up the files using a Mac and BRU; nonetheless, I still want to get away from the Mac server for hosting the shares.
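    One low-risk way to answer the symlink and extended-attribute questions empirically is to mount a test share from the candidate Windows or Linux server and poke at it from a Mac before migrating anything. A quick sketch (the mount point is a placeholder):

      # from a Mac with the test share mounted at /Volumes/testshare
      cd /Volumes/testshare
      touch testfile
      ln -s testfile test-link && ls -l test-link        # does the server accept a symlink?
      xattr -w com.example.note "hello" testfile         # do extended attributes stick?
      xattr -l testfile

    Finder and many Mac applications fall back to AppleDouble "._" sidecar files when the server can't store extended attributes natively, so it is also worth checking whether those sidecar files appear and whether the Windows backup software preserves them.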

    Read the article

  • How to locate phpMyAdmin on Ubuntu

    - by Chris
    Okay, I'm usually a Windows user and I write quite happily there; unfortunately (or fortunately) I have installed Linux on a dual boot, and having installed some software I have a question... Where is it? I installed Apache, PHP, MySQL and, separately, phpMyAdmin. Apache is up and running, I've seen my phpinfo page, and MySQL is there. MySQL is telling me that there's a database for phpMyAdmin, but... erm... I can't seem to locate it. On a Windows machine the directory would be in the www directory and I'd just navigate there: localhost/phpmyadmin/. But on Ubuntu I can't find it in the equivalent place. I've been to /var/www/ and there's my index.html (from Apache) and my phptest.php file, but no phpmyadmin. There is a phpmyadmin directory in /lib but that only has 2 files in it. So, having rambled lots, my question is: what do I have to do to be able to navigate to the phpMyAdmin index page? I realise this could fall under the description of a server-related question and should be posted elsewhere, but as it's software on a home system some help would be appreciated. Do I need to move some files from somewhere? Help! I really don't want to have to go back to developing on Windows as I'll be deploying to a LAMP system, and my learning curve will be steep.
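    For reference, the Ubuntu phpmyadmin package does not put anything under /var/www; it typically installs its files to /usr/share/phpmyadmin and ships an Apache snippet at /etc/phpmyadmin/apache.conf. A sketch of the usual way to wire it up on an Apache 2.2-era Ubuntu follows (exact paths can vary slightly by release, so treat them as assumptions to verify):

      # confirm where the package actually put its files
      dpkg -L phpmyadmin | head

      # make Apache load phpMyAdmin's bundled config, then restart Apache
      sudo ln -s /etc/phpmyadmin/apache.conf /etc/apache2/conf.d/phpmyadmin.conf
      sudo /etc/init.d/apache2 restart

      # phpMyAdmin should then answer at http://localhost/phpmyadmin/

    If the package was installed through apt, running sudo dpkg-reconfigure phpmyadmin and selecting apache2 achieves the same thing without creating the symlink by hand.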

    Read the article

  • I would like to have a publicly accessible Linux box hosted elsewhere. Who provides this service?

    - by Eric Wilson
    I would like to have a general-purpose Linux server available and publicly accessible. I understand that there is no lack of web-hosting companies, but I might want more control over the machine than is typical. I would want the ability to install software, such as an SVN server, and I would like to be able to expose various port numbers, as I may have a variety of extremely low-traffic sites that I would want to have available. Obviously, one option is to host such a machine in my home. Is that my only option? Or is what I describe available out there, perhaps as a virtual machine on a larger server?

    Read the article

  • Freshly installed DD-WRT on a DIR-300 and no internet. What to do?

    - by Erik B
    With the D-Link firmware I just connected the router and it worked, as is expected when using DHCP, but with DD-WRT I have no internet access. It is configured with DHCP, and DD-WRT's WAN status page reports that it is connected and has an IP address. Yet it is impossible to reach the internet. If I disconnect the router and plug the cable directly into my computer I get internet access, so it's obviously the DD-WRT software that isn't doing its job. However, I have no previous experience with the DD-WRT software and have no clue what to look for. I thought it would just work. By the way, the power LED is orange, the internet LED is off, and the wireless and LAN1 LEDs are green. They all used to be green with the D-Link firmware. Not sure if it's relevant, but now you know. Does anyone have any idea what I should do to get internet access (besides reinstalling D-Link's firmware)? EDIT: It's a version B1. I read that it is very different from the A1 version, so I thought it might be relevant.

    Read the article
