Search Results

Search found 19848 results on 794 pages for 'multiple desktops'.


  • Can't resolve offline file conflicts

    - by Bryan
    We use roaming profiles on our Server 2008 R2 domain, with folder redirection for 'desktop', 'my documents' and 'application data'. As our network is split across two sites, we have one file server at each site, configured to use domain-based DFS namespaces and DFS replication to keep things in sync. The DFS path for the redirected folders is:
      \\domain\folderredirection$\<username>\<redirected-folder-name>
    The real paths are:
      \\site-1-server\folderredirection$\<username>\<redirected-folder-name>
      \\site-2-server\folderredirection$\<username>\<redirected-folder-name>
    As our users all switch between sites (sometimes several times per day), our folder redirection policy has to point at the DFS roots rather than being hardcoded to a specific server. Both DFS and DFS-R have been proven to be working perfectly. On our laptops we use offline files for the redirected folders, and this also works fine. The problem is as follows: when conflicts occur in offline files, it is impossible to resolve them. I'm given the usual conflict resolution options ('Ignore', 'Keep both', 'Keep network' and 'Keep local'), but none of them resolves any conflict, and no error is produced. We only use offline files on laptops, which run either Windows XP Professional or Windows 7 Professional. The problem is not specific to any one laptop; it affects every laptop and every conflicting file in exactly the same way. I would have thought our setup is common for companies with multiple sites, so I'm hoping someone has seen this before.

    Read the article

  • How to have multiple web servers and IPs on the same physical network

    - by jsigned
    I do web development out of a small office and need multiple physical and virtual servers that can be accessed from the internet. I also have a number of devices (computers, laptops, tablets, printers, etc.) that need connections as well. I have gotten a subnet of eight IPs from my ISP, and while that is adequate for the web servers, it's far too small for everything that needs access to the network. My router is an ASUS RT-N16 running DD-WRT. I'm just smart enough about this routing topic to be dangerous; think of a two-year-old with a magic marker. I would like to keep my internal network NAT'ed on the 192.168.x.x network and route the 68.69.x.x/29 (255.255.255.248) traffic directly to the servers. The physical network consists of the 4-port DD-WRT router and an unmanaged gigabit switch. I have a fiber connection to the office that presents as an Ethernet port; in other words, I can plug my laptop directly into it and have access to the internet. There is no login or password, and the router is set up to get DHCP from the ISP and to provide DHCP addresses for the internal network. What I've done so far is google and try different configurations with little success. In the end I decided I didn't even know how to ask the right questions. My questions are: Is this the best way to configure the network? How do you do it? VLANs? Multiple routers? I've never had to configure a router using anything more than the GUI, so if this is command-line stuff, be gentle.
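
    One way to sketch the "route publics to servers, NAT everything else" idea without VLANs is one-to-one NAT on the DD-WRT box itself. The addresses below are placeholders (203.0.113.0/29 stands in for the real 68.69.x.x/29 block, 192.168.1.10 for one server, vlan2 for DD-WRT's usual WAN interface), so treat this as an assumption-laden sketch rather than a recipe:

      # DD-WRT firewall script: map one public address to one internal server
      iptables -t nat -I PREROUTING  -d 203.0.113.3 -j DNAT --to-destination 192.168.1.10
      iptables -t nat -I POSTROUTING -s 192.168.1.10 -j SNAT --to-source 203.0.113.3
      # the router must also own the extra public address on its WAN side
      ip addr add 203.0.113.3/29 dev vlan2

    Repeat the three lines per server; everything keeps living on 192.168.x.x internally.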

    Read the article

  • One Comcast Business Gateway, One Router, Two Web Servers

    - by Kevin Scheidt
    I have a Comcast business account with a router and a web server (info) attached. Behind the router there are multiple computers and a second web server (com), which also serves as a file server. (info) has two NICs in it: one connected directly to Comcast and one connected to the router. It needs to serve its websites to the world; it also needs, however, to be able to see all the internal computers and (com)'s served files. With just one NIC (the one connected to the router, not Comcast), (info) works fine but no one outside can see it. (com) serves port 80, and (info) needs to handle port 80 as well. I have two domain names registered and five static IPs from Comcast. Right now http://www.graceamazing.com, handled by (com), works fine, and http://www.graceamazing.com:1307, handled by (info), works fine. But as soon as I enable the 2nd NIC in (info), http://www.graceamazing.info runs extremely (horribly) slow; http://www.graceamazing.com:1307 and .com, however, still work fine. (com) has an IP address via the router, 70.89.233.41. (info) has an IP address of 70.89.233.46 via Comcast (2nd NIC) and a static internal IP of 192.168.x.100 behind the router. Any suggestions or changes to make so that http://www.graceamazing.info performs with the same speed as http://graceamazing.com:1307? Is there a setting I should check / could have missed?

    Read the article

  • ISC DHCPD IPv6 for multiple interfaces

    - by Seoman
    I want to assign multiple IPv6 addresses to a server with multiple NICs. As the IPv6 RFC defines, each server has a unique DUID, which can have one of three formats (LL, LLT or enterprise), and each NIC has an IAID. So a request from NIC1 sends the DUID and the IAID of NIC1, and a request from NIC2 sends the same DUID but a different IAID. The problem is that from a CentOS box, when I ask for an IP on two different interfaces, I get the same IP. I can't find how to specify a host entry based on both the DUID and the IAID. I see some people generating a unique DUID based on the MAC of each NIC, but this is not what the IPv6 RFC says. What I tried is:
      host entry1 {
        host-identifier option dhcp6.client-id 00:01:00:01:19:fc:f8:1c:52:54:00:7e:c9:ec;
        option dhcp6.ia-na "00:09:40:5d";
        fixed-address6 2001:db8:0:1::202;
      }
      host entry2 {
        host-identifier option dhcp6.client-id 00:01:00:01:19:fc:f8:1c:52:54:00:7e:c9:ec;
        option dhcp6.ia-na "00:7e:c9:ec";
        fixed-address6 2001:db8:0:1::201;
      }
    This causes a segmentation fault in the client (which is scary...). I guess this is not the right use for the ia-na option, but I don't see any other option.

    Read the article

  • Multiple static WAN IP addresses to single LAN subnet

    - by Jessy Houle
    Below is my home network topology. I currently have five static IP addresses, three of which are in use by three routers. These routers in turn subnet internal networks and port forward. I use my SSL VPN appliance to remote home from work or on the road; at this point I can remotely administer my Windows Server. I know the network is set up wrong; I was matching existing hardware as best I knew how. http://storage.jessyhoule.com.s3.amazonaws.com/network_topology.jpg OK, this said, here is the problem... One of the websites on my Windows Server now needs to be secure (SSL using port 443). However, I'm already forwarding port 443 to my VPN appliance. Furthermore, if I'm going to have to reconfigure the network, I would really like to be able to use the SSL VPN to remotely administer all machines. I mentioned this to a friend of mine, who said that what I was looking for was a firewall, explaining that a firewall would take in multiple static (WAN) IP addresses while still allowing all internal devices to be on the same network. So, basically, I could give my SSL VPN appliance its very own static (WAN) IP address and yet have it on the same internal network (192.168.1.x) as all my other devices. The first question is: does this sound right? Secondly, would you suggest anything different? And, finally, what is the cheapest way to do this? I have started down the road of downloading/installing Untangle and Smoothwall to see if they will do the job, hoping they accept multiple static (WAN) IP addresses. Thank you in advance for your answers. -Jessy Houle

    Read the article

  • How to Log Into a Web App Simultaneously with Different Accounts?

    - by Ngu Soon Hui
    I want to log into a web application using at least ten account names at a single point in time (I am not trying to do anything illegal, so don't worry). AFAIK, each tab in Chrome shares the same session, so on one machine Chrome can log into at most two accounts: one in normal mode, another in Incognito mode. Is there any way I can log into multiple accounts? I know I can open up IE and Firefox (and probably Safari, etc.) and log in, but this is not really scalable, as the number of web browsers is finite. Edit: My application is a localhost application; it resides on my computer. So a proxy may not be that useful, and you now probably understand why it's nothing illegal. Edit 2: CookieSwap seems like a good idea, but the problem is that once I swap the cookie, the cookies for all tabs and the whole Firefox application are swapped as well. Can the swapping be done per tab or per application, so that on a dual-monitor setup I can see the different logins side by side?
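
    One more browser-side option, hedged since it depends on your setup: Chrome keeps cookies per user-data directory, so each separately launched profile is an independent session. The profile paths and the localhost port below are invented for illustration:

      # each profile has its own cookie jar, so each window is a separate login
      google-chrome --user-data-dir="$HOME/.chrome-profiles/acct1" "http://localhost:8080" &
      google-chrome --user-data-dir="$HOME/.chrome-profiles/acct2" "http://localhost:8080" &

    Ten profiles give ten simultaneous logins in one browser, one window per account, which also spreads nicely across two monitors.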

    Read the article

  • How can I get Windows 7 to work with two Nvidia graphics cards with different drivers?

    - by Max
    This is similar to this question, but I am using more similar cards with Windows 7. I just purchased a Zotac Nvidia GeForce 7200 GS. I have a motherboard with two PCI Express x16 slots. There is already an MSI Nvidia GeForce 8800 GTS being used as the primary card, driving two LCD monitors. I would like the Zotac to output to a TV via DVI-out. Unfortunately, when Windows detects the Zotac and installs its drivers, or I manually install them, Windows stops being able to boot up. If I remove them and re-install the MSI 8800 drivers, I can boot again, but Windows can no longer see the Zotac 7200; it shows up as a yellow triangle in Device Manager. I've read conflicting reports about this. Some people claim that Windows 7 will support multiple heterogeneous graphics card drivers as long as they all use the same driver API (WDDM?). Others say that they have to use the exact same driver, or it won't work. Others claim that you have to use the exact same card. Which is it, exactly? I know I can run the MSI 8800 in SLI if I purchase another, but I don't need that kind of power; I just need HD-out to my television. I read somewhere that running two cards in SLI precludes you from using 100% of their output ports, so I'm not sure if that's an option. I suppose I could also run two MSI 8800s without SLI, but again, that's more power than I need (and more money than I'd like to spend). Also, I don't think this exact model is even manufactured anymore. Any ideas?

    Read the article

  • Multiple Devices connecting to VPN on CentOS server

    - by jfreak53
    I am looking for a recommendation on VPN software for multiple OSes and devices. I currently have 15 systems to connect to a VPN. I was using Hamachi from LogMeIn, but their lack of Android support really upsets me, and their limited support for Linux OSes is also a letdown. 90% of my systems are Ubuntu 11+ systems; only two are Windows XP. But I also have a few people, maybe three, who need to connect from Android devices. This is where Hamachi has let me down, and I want to move to my own VPN solution. The server would be a simple VPS running CentOS. So I need some VPN software that allows those clients to connect to a Linux-based server. I wanted to go with OpenVPN, but I am under the impression that on every OS you have to install their software to connect to the VPN. Ubuntu supports VPNs out of the box, but OpenVPN requires extra software to be installed; I don't want this if I can help it. Same with Windows and same with Android. Plus, Android mostly requires rooted devices for OpenVPN, at least from what I've read. I was looking at maybe L2TP, but I'm not sure how easy it is to get Ubuntu systems connected with it, as I haven't found much on the subject, let alone Windows XP machines. I know Android connects to it out of the box. I don't know much about L2TP, but I know it's a pain to get running on CentOS from what I have read. The last option is some sort of software for PPTP, but I've never read anything on it and don't know if all systems are compatible with it. What would be your solution for these devices and multiple OSes? OpenVPN seems to be where I'm heading; I just don't like that it always requires client software and rooted Android devices. Any suggestions for setup? Maybe a different server OS, like Ubuntu, would make another type of VPN easier?
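
    For reference, the PPTP route on CentOS is usually the Poptop (pptpd) daemon, and its main draw is exactly what's asked for here: Windows XP, Ubuntu and (unrooted) Android all ship a PPTP client. A minimal sketch with invented addresses and credentials; the trade-off is PPTP's weak MS-CHAPv2-based security:

      # /etc/pptpd.conf
      localip 192.168.240.1
      remoteip 192.168.240.10-30

      # /etc/ppp/chap-secrets   (client / server / secret / allowed IPs)
      alice   pptpd   "s3cret"   *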

    Read the article

  • Multiheaded X.org with a single workspace-pool

    - by blauwblaatje
    I've got an idea for x.org/$randomwindowmanager in combination with a multiheaded setup, but I haven't figured out how it should work. Also, I don't really know where to place the feature request. Now for the idea. I've been working with screen (wikipedia:GNU_Screen) for some years now. One thing I like about it is its multi-display mode (screen -x), where multiple terminals are all connected to the same screen session. The fun thing is that you can have two terminals showing the same content, and switch the on-screen layout without moving the terminals. I admit that in screen this is not extremely useful, but I think for a window manager it could be. Imagine this: you've got two monitors and four workdesks. On one workdesk I've got my IDE with code, on the second the output, on the third the documentation, and on the fourth my e-mail and IM clients. At one moment I want my IDE and output on my monitors, at another my code and documentation, and at yet another my IM (to consult a colleague) and documentation or code. Finally, my colleague comes to help me at my desk. I'd like it if we could both watch the same workdesk without him sitting on my lap, so I turn one monitor so he can see it better. It would be great if we could both see the same thing that's on my monitor (excluding the mouse pointer). The thing with most WMs is that your workspaces on the two monitors are either separated or glued together. If they're separated, you can change workspaces on each monitor independently, but you can't exchange applications between monitors because they're different X clients (IIRC). If they're glued together (Xinerama), you can exchange applications, but when changing your workspace the other monitors change too. So, what I'd like to know is this: is this already possible, or should I submit a feature request somewhere (and if so, where)?

    Read the article

  • Word 2010 creates multiple processes... sometimes

    - by Bill Sambrone
    I've run into a strange behavior after migrating our users from Office 2007 / Vista to Office 2010 / Windows 7 (all 32-bit). They use a web-based document management system called NetDocuments, which stores all their .doc/.docx files. Generally, when they click on a doc from the browser window, it fires up Word and opens the doc. Word has an add-in from NetDocs as well, so it can upload the changed document directly back to the NetDocs server. I get a phone call when Word crashes, and every single time it has crashed I have witnessed multiple winword.exe processes running in Task Manager. I used Process Explorer to see what created the processes, and it is always Internet Explorer. So far I have rolled users back to IE8 and the problem happens less frequently, but it still happens. When I try to duplicate the problem, I can sometimes make it happen by opening multiple documents very quickly. Using lightning-fast Alt-Tab reflexes, I DO see that a second winword.exe process is created when a user clicks on a document, and that it closes once the document is open. I think what is happening is that the secondary winword.exe process that does some sort of NetDocs voodoo is getting stuck open. This behavior is new to Word 2010 / Windows 7, and Google searching isn't coming up with much. I have seen a few posts saying this is a known issue in certain circumstances and there is no "fix", but I thought it would be good to ask others. Thanks!

    Read the article

  • Host multiple domains with Apache on a VPS

    - by Kunal
    Hi, I have recently bought a Windows VPS and installed Apache and MySQL using XAMPP. All services are running fine, and I can access the hosted site using the IP of the VPS, but what I need to do is host multiple domains on that server. Actually, my requirement was to use .htaccess on a PHP-based site, but the site had to access data from an MS SQL Server as well, so I needed to enable php_mssql.dll in php.ini, which no shared host supported; I had to go for this VPS. Plesk was installed by default, but as .htaccess won't work with IIS, I had to stop the IIS service and install Apache there. Now all is well, but I need to find a way to host multiple domains. When I bought the VPS, the hosting people sent me the dedicated IP of the server and also the two name servers required to host domains. What is the next step? Exactly which file do I need to modify to get things done? Please help! Any kind of help is much appreciated. Thanks in advance for all of your kind help and time.
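
    The Apache-side mechanism here is name-based virtual hosting: point both domains' DNS at the dedicated IP, then give each domain its own VirtualHost block. A sketch in Apache 2.2 syntax with invented domain names and XAMPP-style paths (XAMPP ships a conf/extra/httpd-vhosts.conf for exactly this):

      NameVirtualHost *:80

      <VirtualHost *:80>
          ServerName example-one.com
          DocumentRoot "C:/xampp/htdocs/example-one"
      </VirtualHost>

      <VirtualHost *:80>
          ServerName example-two.com
          DocumentRoot "C:/xampp/htdocs/example-two"
      </VirtualHost>

    Apache picks the block whose ServerName matches the Host header of the incoming request, so one IP can serve any number of domains.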

    Read the article

  • Making it Easier for Older Users to Log In to Multiple Accounts

    - by Mike Hagstrom
    I currently do consulting for a small business that has multiple applications they need to log in to. I'm trying to get them to start using Basecamp and Zendesk to make all of our lives easier when it comes to collaboration on big projects and quick helpdesk ticket items. However, I have recently been informed that it is difficult for them to remember all of these websites to log in to, even though the login information is the same. Right now they have to log in to:
      Windows login
      Gmail
    I want them additionally to log in to:
      Basecamp
      Zendesk
    This is just a gap of a generation or two between myself and them, so I'm wondering what others do to solve these problems. Is there some way we could configure USB thumb drives with LastPass or something similar, so that when plugged into the computer they automatically log the user into their Windows account, and then, when the user visits, say, Basecamp, automatically log them into that as well? I think the security risk (of a lost thumb drive) is well worth the ability to use these extra applications. Unless anyone else has other ways of making it easier for users to log in to multiple sites.

    Read the article

  • Can MySQL use multiple data directories on different physical storage devices

    - by sirlark
    I am running MySQL with its data dir on a 128GB SSD. I am dealing with large datasets (~20GB) that are loaded and processed weekly, each stored in a separate DB for the purposes of time-point comparisons. Putting all the data into a single database is unfeasible because performance on such large databases is already a problem. However, I cannot keep more than six datasets on the SSD at a time. Right now I am manually dumping the oldest to a much larger 2TB spinning disk every week and dropping the database to make space for the new one. But if I need one of the 'archived' databases (a semi-regular occurrence), I have to dump and drop a current one, reload the archived one, do what I need to, then reverse the process. Is there a way to configure MySQL to use multiple data directories, say one on the SSD and one on the 2TB spinning disk, and 'merge' them transparently? If I could do this, then archiving would no longer mean "moved out of the database entirely" but instead "moved onto the slow physical device". The time taken to run my queries on a spinning disk would be less than that taken to completely dump, drop, load, drop and reload two entire databases, so this is a win. I thought of using something like unionfs, but I can't think of a way to control which database gets stored on which physical drive, because it merges at the directory level (from what I understand), so I'm still stuck with multiple directories. Any help appreciated; thanks in advance.
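
    Short of true multiple data directories, the time-honoured workaround is to move a single database's directory to the big disk and symlink it back into the data dir; MySQL follows the symlink transparently. A sketch with an invented database name and mount point (stop the server for the move, and note that InnoDB tables living in a shared ibdata file will not split out this way):

      service mysql stop
      mv /var/lib/mysql/dataset_2012w40 /mnt/archive/mysql/dataset_2012w40
      ln -s /mnt/archive/mysql/dataset_2012w40 /var/lib/mysql/dataset_2012w40
      service mysql start

    Archiving then becomes "move and symlink" instead of "dump and drop", which matches the "moved onto the slow physical device" goal.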

    Read the article

  • Creating multiple SFTP users for one account

    - by Tom Marthenal
    I'm in the process of migrating an aging shared-hosting system to more modern technologies. Right now, plain old insecure FTP is the only way for customers to access their files. I plan on replacing this with SFTP, but I need a way to create multiple SFTP users that correspond to one UNIX account. A customer has one account on the machine (e.g. customer) with a home directory like /home/customer/. Our clients are used to being able to create an arbitrary number of FTP accounts for their domains (to give out to different people). We need the same capability with SFTP. My first thought is to use SSH keys and just add each new "user" to authorized_keys, but this is confusing for our customers, many of whom are not technically-inclined and would prefer to stick with passwords. SSH is not an issue, only SFTP is available. How can we create multiple SFTP accounts (customer, customer_developer1, customer_developer2, etc.) that all function as equivalents and don't interfere with file permissions (ideally, all files should retain customer as their owner)? My initial thought was some kind of PAM module, but I don't have a clear idea of how to accomplish this within our constraints. We are open to using an alternative SSH daemon if OpenSSH isn't suitable for our situation; again, it needs to support only SFTP and not SSH. Currently our SSH configuration has this appended to it in order to jail the users in their own directories:
      # all customers have group 'customer'
      Match group customer
          ChrootDirectory /home/%u       # jail in home directories
          AllowTcpForwarding no
          X11Forwarding no
          ForceCommand internal-sftp     # force SFTP
          PasswordAuthentication yes     # for non-customer accounts we use keys instead
    Our servers are running Ubuntu 12.04 LTS.
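
    One approach that may fit, sketched under the assumption that sharing a UID is acceptable: create each extra login with the same numeric UID/GID as customer, so every file it touches is owned by customer, and give it the shared home directory (account names below are hypothetical):

      # second login that is numerically identical to 'customer'
      useradd --non-unique \
              --uid "$(id -u customer)" \
              --gid "$(id -g customer)" \
              --home-dir /home/customer \
              --shell /usr/sbin/nologin \
              customer_developer1
      passwd customer_developer1

    Since ChrootDirectory /home/%u expands to the login name, the Match block would need ChrootDirectory %h (the account's home directory) so these extra accounts land in the shared jail.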

    Read the article

  • Extend Linux Desktop to another X Windows Display

    - by unknown (google)
    Hello, I am a long-time Linux user of Xinerama and other technologies for extending a desktop to multiple monitors. However, when I travel with my laptop I miss the multi-monitor support I enjoy at home. Recently I acquired a second laptop for a low price. Both laptops are running Fedora (versions 10 and 11 respectively). I use GNOME as my primary desktop environment. I know about Synergy; I use it all the time to control the screens of other Windows / Linux systems. What I would like to know is: can I sit my primary and secondary laptops together and achieve a Xinerama-like extended desktop environment? Ideally I would like to start a GNOME session on my primary laptop, then start an X desktop on my secondary laptop and extend my primary laptop's desktop onto it. I would like to be able to move windows from the primary desktop to the secondary laptop's desktop. Would I need to use Synergy together with some other bit of X technology to do this? Or is there X technology that will do all of this for me? I am familiar with X's ability to display applications remotely. I am also familiar with NoMachine's NX.
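
    The X-native piece this seems to describe is Xdmx, the distributed multihead X server: it aggregates displays from several machines into one logical X display, optionally with Xinerama, so one GNOME session spans both laptops and windows can be dragged between them. A sketch with invented hostnames (both machines must allow X connections from each other, and network latency is the usual drawback):

      # run on the primary laptop; second-laptop:0 is the other machine's own X server
      Xdmx :1 +xinerama -display primary-laptop:0 -display second-laptop:0
      DISPLAY=:1 gnome-session   # start the desktop on the combined display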

    Read the article

  • 3 Monitor PCI-e Graphics card on Linux (without tremendous pain)?

    - by N Rahl
    As we are all painfully aware, the only way to get multiple monitors AND compositing (Compiz) on Linux is to use a single graphics card that can drive both (or in my case all three) screens. I bought a Radeon 5750 specifically because it claims to be able to drive 3 monitors. I can plug in 3 monitors (2 DVI, 1 HDMI) and the Catalyst Control Center shows all 3, but only 2 can be enabled at a time. The exact message is:
      The current settings cannot be applied. Possible issues may include:
      - Display(s) cannot be enabled.
      - Setting(s) cannot be applied due to insufficient video memory.
    So I'm going to assume that either the 5750 doesn't support 3 monitors, OR, more likely, ATI couldn't be bothered to add that support to their Linux drivers. So this is a multipart question: First, can anyone suggest a PCI Express graphics card that can run 3 screens on Linux without tremendous pain? I'm looking for something where you install the driver and all three screens "just work". Does such a card exist? Second, if you have a 5750, have you been able to get it to do 3 monitors? I'm running Ubuntu 10.04 at the moment.

    Read the article

  • HAProxy, health checking multiple servers with different host names

    - by Marco Bettiolo
    I need to load balance between multiple running servers with different host names; I cannot set up the same virtual host on each one. Is it possible to have only one listen configuration with multiple servers, and make the health checks apply the http-send-name-header Host directive? I am using HAProxy 1.5. I came up with this working haproxy.cfg; as you can see, I had to set a different hostname for each health check, since the health check ignores http-send-name-header Host. I would have preferred to use variables or some other method and keep things more concise.
      global
          log 127.0.0.1 local0 notice
          maxconn 2000
          user haproxy
          group haproxy
      defaults
          log global
          mode http
          option httplog
          option dontlognull
          retries 3
          option redispatch
          timeout connect 5000
          timeout client 10000
          timeout server 10000
          stats enable
          stats uri /haproxy?stats
          stats refresh 5s
          balance roundrobin
          option httpclose
      listen inbound :80
          option httpchk HEAD / HTTP/1.1\r\n
          server instance1 127.0.0.101 check inter 3000 fall 1 rise 1
          server instance2 127.0.0.102 check inter 3000 fall 1 rise 1
      listen instance1 127.0.0.101:80
          option forwardfor
          http-send-name-header Host
          option httpchk HEAD / HTTP/1.1\r\nHost:\ www.example.com
          server www.example.com www.example.com:80 check inter 5000 fall 3 rise 2
      listen instance2 127.0.0.102:80
          option forwardfor
          http-send-name-header Host
          option httpchk HEAD / HTTP/1.1\r\nHost:\ www.bing.com
          server www.bing.com www.bing.com:80 check inter 5000 fall 3 rise 2

    Read the article

  • Linux: 3 Monitor PCI-e Graphics card (without tremendous pain)?

    - by N Rahl
    As we are all painfully aware, the only way to get multiple monitors AND compositing (Compiz) on Linux is to use a single graphics card that can drive both (or in my case all three) screens. I bought a Radeon 5750 specifically because it claims to be able to drive 3 monitors. I can plug in 3 monitors (2 DVI, 1 HDMI) and the Catalyst Control Center shows all 3, but only 2 can be enabled at a time. I'll post the exact error message here soon, but it's not very useful. So I'm going to assume that either the 5750 doesn't support 3 monitors, OR, more likely, ATI couldn't be bothered to add that support to their Linux drivers. So this is a multipart question: First, can anyone suggest a PCI Express graphics card that can run 3 screens on Linux without tremendous pain? I'm looking for something where you install the driver and all three screens "just work". Does such a card exist? Second, if you have a 5750, have you been able to get it to do 3 monitors? I'm running Ubuntu 10.04 at the moment. Thanks, Nick

    Read the article

  • Nginx Multiple If Statements Cause Memory Usage to Jump

    - by Justin Kulesza
    We need to block a large number of requests by IP address with nginx. The requests are proxied by a CDN, so we cannot block on the actual client IP address (it would be the IP address of the CDN, not the actual client). Instead, we have $http_x_forwarded_for, which contains the IP we need to block for a given request. Similarly, we cannot use iptables, as blocking the IP address of the proxied client would have no effect. We need nginx to block the request based on the value of $http_x_forwarded_for. Initially, we tried multiple simple if statements: http://pastie.org/5110910 However, this caused our nginx memory usage to jump considerably: we went from somewhere around a 40MB resident size to over a 200MB resident size. If we changed things up and created one large regex that matched the necessary IP addresses, memory usage was fairly normal: http://pastie.org/5110923 Keep in mind that we're trying to block many more than 3 or 4 IP addresses; more like 50 to 100, which may be included in several (20+) nginx server configuration blocks. Thoughts? Suggestions? I'm interested both in why memory usage would spike so greatly with multiple if blocks, and in whether there are any better ways to achieve our goal.
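
    As a hedged aside, the usual replacement for a chain of ifs here is a single map, which compiles the whole list into one lookup table and sets a flag that a single if acts on (map blocks live at http level; the addresses below are documentation placeholders):

      map $http_x_forwarded_for $blocked_xff {
          default                   0;
          "~\b203\.0\.113\.10\b"    1;
          "~\b198\.51\.100\.7\b"    1;
      }

      server {
          # ...
          if ($blocked_xff) { return 403; }
      }

    One map shared by all 20+ server blocks also keeps the list in a single place, unlike per-server if chains.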

    Read the article

  • 3 Monitor PCI-e Graphics card (without tremendous pain)?

    - by N Rahl
    As we are all painfully aware, the only way to get multiple monitors AND compositing (Compiz) on Linux is to use a single graphics card that can drive both (or in my case all three) screens. I bought a Radeon 5750 specifically because it claims to be able to drive 3 monitors. I can plug in 3 monitors (2 DVI, 1 HDMI) and the Catalyst Control Center shows all 3, but only 2 can be enabled at a time. The exact message is:
      The current settings cannot be applied. Possible issues may include:
      - Display(s) cannot be enabled.
      - Setting(s) cannot be applied due to insufficient video memory.
    So I'm going to assume that either the 5750 doesn't support 3 monitors, OR, more likely, ATI couldn't be bothered to add that support to their Linux drivers. So this is a multipart question: First, can anyone suggest a PCI Express graphics card that can run 3 screens on Linux without tremendous pain? I'm looking for something where you install the driver and all three screens "just work". Does such a card exist? Second, if you have a 5750, have you been able to get it to do 3 monitors? I'm running Ubuntu 10.04 at the moment.

    Read the article

  • "merging" multiple internet connections

    - by Spencer R
    I've seen this question asked several times here on SF, but I'm looking for some updated information, specifically concerning Server 2012. I'm in the process of buying a home, so I'm trying to get some plans together for how I want to structure my network. Internet speeds aren't the greatest and connections can be unreliable where the house is, so I was thinking of having two DSL lines installed. My question is: how could I leverage those two connections to create the best network I can, in terms of speed and reliability? My parents will be moving in with me; they consume a lot of bandwidth as it is, and once my internet traffic is added, I'm headed for a lot of frustration. I thought I remembered reading somewhere that Server 2012 has some new functionality for utilizing multiple connections on multiple NICs in a way that wasn't possible in earlier versions of Server. Not sure if Windows will work, but I'm an application developer and spend the majority of my time in Windows environments. However, I've only recently returned to the Windows world, so I'd like my main server at home to run Windows Server 2012 so that I can become more familiar with it.

    Read the article

  • How to securely connect to multiple different LDAPS servers (Debian)

    - by Pickle
    I'm trying to connect to multiple different LDAPS servers. A lot of the documentation I've seen recommends setting TLS_REQCERT never, but that strikes me as horribly insecure, since it skips certificate verification entirely, so I've set it to demand. All the documentation I've seen says I need to update ldap.conf with a TLS_CACERT directive pointing to a .pem file. I've got that .pem file set up with the certificate from LDAP server #1, and LDAPS connections are happening fine. I've now got to communicate securely with another LDAP server, in another branch of my organization, that uses a different certificate. I've seen no documentation on how to do this, except one page that says I can simply put multiple (not chained) certificates in the same .pem file. I've done this and everything is working hunky-dory. However, when I told a colleague what I did, he sounded like the sky was falling: putting two non-chained certificates into one .pem file is apparently the worst thing since... ever. Is there a more acceptable way to do this? Or is this the only accepted way?
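
    If a multi-CA .pem file is the sticking point, OpenLDAP's client library also accepts a hashed directory of individual CA certificates via TLS_CACERTDIR, which keeps each branch's CA in its own file. A sketch with invented paths and file names:

      mkdir -p /etc/ldap/cacerts
      cp branch1-ca.pem branch2-ca.pem /etc/ldap/cacerts/
      c_rehash /etc/ldap/cacerts     # from the openssl package; creates hash symlinks

      # /etc/ldap/ldap.conf
      TLS_CACERTDIR /etc/ldap/cacerts
      TLS_REQCERT demand

    Either way, only trust anchors are being pooled, which is exactly how distribution CA bundles already work, so the sky is probably staying up.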

    Read the article

  • Permissions for Multiple User VPS

    - by adnymarc
    I have a Linode VPS that I recently set up and am migrating to from Media Temple, where I have a VPS managed by Plesk. I dislike the Plesk interface and the mess it makes of a lot of things, but appreciated its ability to give multiple people access to different domains on a server. I have most everything set up the way I would like it, but am having issues with permissions for my domain directories. I am running Ubuntu 8.04 LTS and Apache 2 as my web server. I have domains successfully located in /var/www/vhosts/domainname.com, but have to modify files as root in order to add or change files for the domains. I would like to set up access with the following criteria:
      Each domain can have a user assigned to it (and allow the same user to manage multiple domains; I could even create symlinks in their home folder to their domains).
      Certain users will have shell access and may be chrooted to the domain directory they control.
      FTP needs to be set up so that content editors for each domain can upload/download without permissions issues.
    I am relatively new to Linux sysadmin work and have searched for a good guide to help solve these issues, but haven't been able to find one yet. Thanks in advance for your help.
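
    For the shared-editing side, a conventional sketch is one group per domain plus the setgid bit, so files created by any member stay group-accessible (group and user names invented):

      groupadd example-com
      usermod -aG example-com alice      # alice edits this domain
      chgrp -R example-com /var/www/vhosts/domainname.com
      chmod -R g+rwX /var/www/vhosts/domainname.com
      # setgid on directories: new files inherit the group
      find /var/www/vhosts/domainname.com -type d -exec chmod g+s {} +

    For the chroot and FTP criteria, OpenSSH's internal-sftp with a per-group ChrootDirectory covers both in one daemon, avoiding a separate FTP server.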

    Read the article

  • Asterisk relay between multiple subnets

    - by immoune
    I wonder what's the best way to go when you have phones on multiple networks which are not directly reachable. I have three networks: 10.3.x.x, 10.6.x.x and 10.17.x.x. My Asterisk server resides at 10.3.0.5. The machines from the 10.6 and 10.17 networks are routed here through VPN tunnels; there is no NAT anywhere on the network, just pure routing. Since the 10.3.0.5 PBX has routes back to all the subnets, it has no problem communicating with softphones/hardphones from these ranges. The problem is that Asterisk (as far as I understand) is only responsible for the SIP signalling, not the audio/video transmission, which is done peer-to-peer between the devices. So although a client using Sipdroid on 10.6.x.x is able to connect to the PBX (10.3.0.5) and dial a Bria client on the 10.17.x.x network, once the phone rings and the call is established, no audio is transmitted, simply because the two clients have no way to connect to each other directly. There are multiple solutions for this described in this text: http://msdn.microsoft.com/en-us/library/ee480411%28v=winembedded.60%29.aspx What I would prefer is to keep these networks segregated as they are now. What would be the best solution? Is it possible to relay all the audio/video through the Asterisk server? That would be best in my case; I am using AstLinux there, which has a lot of other parts. Thanks
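
    On the "relay through Asterisk" question: with direct media disabled, Asterisk stays in the RTP path, so each phone exchanges audio only with 10.3.0.5, which every subnet can already reach. A sketch for sip.conf (the option is spelled canreinvite=no on older releases):

      [general]
      directmedia=no    ; Asterisk relays RTP instead of re-inviting peers to talk directly

    The trade-off is that all call audio now transits the PBX, so the AstLinux box's bandwidth and CPU become the limiting factor.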

    Read the article

  • Supporting multiple instances of a plugin DLL with global data

    - by Bruno De Fraine
    Context: I converted a legacy standalone engine into a plugin component for a composition tool. Technically, this means that I compiled the engine code base to a C DLL which I invoke from a .NET wrapper using P/Invoke; the wrapper implements an interface defined by the composition tool. This works quite well, but now I have received a request to load multiple instances of the engine, for different projects. Since the engine keeps the project data in a set of global variables, and since the DLL with the engine code base is loaded only once, loading multiple projects means that the project data is overwritten. I can see a number of solutions, but they all have some disadvantages:
      1. Create multiple DLLs with the same code, which Windows sees as different DLLs, so their code is not shared. This probably already works if you make multiple copies of the engine DLL with different names. However, the engine is invoked from the wrapper using DllImport attributes, and I think the name of the engine DLL needs to be known when compiling the wrapper. Obviously, having to compile a different version of the wrapper for each project would be quite cumbersome.
      2. Run the engine as a separate process. The wrapper would launch a separate process for the engine when it loads a project and use some form of IPC to communicate with it. While this is a relatively clean solution, it requires some effort to get working, and I don't know which IPC technology would be best for this kind of construction. There may also be significant communication overhead: the engine needs to frequently exchange arrays of floating-point numbers.
      3. Adapt the engine to support multiple projects. This means the global variables should be put into a project structure, and every reference to the globals should be converted to a corresponding reference relative to a particular project (a sketch of this option follows below). There are about 20-30 global variables, but as you can imagine, they are referenced from all over the code base, so this conversion would need to be done in some automatic manner. A related problem is that you should be able to reference the "current" project structure in all places, but passing this along as an extra argument in each and every function signature is also cumbersome. Does there exist a technique (in C) to inspect the current call stack and find the nearest enclosing instance of a relevant data value there?
    Can the stackoverflow community give some advice on these (or other) solutions?
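
    A minimal sketch of that third option, with invented names: fold the globals into a context struct and keep exactly one global, the "current project" pointer, so existing function signatures survive the conversion:

      /* sketch of option 3; all names here are hypothetical */
      typedef struct EngineProject {
          float *samples;       /* formerly a global, e.g. float *g_samples   */
          int    sample_count;  /* formerly a global, e.g. int g_sample_count */
      } EngineProject;

      static EngineProject *g_current;  /* the single remaining global */

      void engine_set_current(EngineProject *p) { g_current = p; }

      /* every old reference "g_sample_count" is rewritten "g_current->sample_count" */
      int engine_sample_count(void) { return g_current->sample_count; }

    The .NET wrapper calls engine_set_current before forwarding each call for a given project. As for querying the call stack: standard C has no portable way to walk enclosing frames looking for a value; the explicit context pointer (or thread-local storage, if calls may interleave) is the usual substitute.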

    Read the article
