Search Results

Search found 17519 results on 701 pages for 'live environment'.


  • Enabling printing within a Terminal Server environment that is published to the Internet?

    - by Albert Widjaja
    Home and remote-office users connect to the Terminal Server on my Windows Server 2003 machine, which I publish securely through a Juniper SSL VPN client applet; they use a normal Internet connection to open a link that pops up the Terminal Server Remote Desktop application. My question is: how can they print documents from within their Terminal Server session? Over the internal office LAN, mapping the printer through the Remote Desktop connection is the solution, but that doesn't apply here. Any kind of help and suggestion would be greatly appreciated. Thanks
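
    A hedged sketch of one avenue: printer redirection travels inside the RDP session itself, so it can work over the SSL VPN as long as both ends allow it. On the server, check that Windows printer mapping is not disabled under Terminal Services Configuration (RDP-Tcp properties, Client Settings tab). If the Juniper resource profile lets you edit the published session's RDP parameters, settings along these lines (the host name is a placeholder) ask the client to forward its local printers:

        full address:s:termserver.example.com
        redirectprinters:i:1

    The matching printer drivers also need to be installed on the Terminal Server, or users will see no mapped printer inside their sessions.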


  • Setup for a live (low-latency) audio/video broadcast over Wi-Fi?

    - by Majal Mirasol
    The Upgrade: We capture audio (from a mixer) and video (from a camera) in a main auditorium and pass them to separate rooms within the building. We used to do this over manually run audio/video cables; we wanted to upgrade the system and broadcast the stream wirelessly over Wi-Fi.

    The Problem: In our current setup (Wirecast running on an A10 over a Wireless-N network), the streams are delayed by one to five minutes on the clients (laptop/iPad/Android). This had not been a problem with the previous wired connections, and since the wireless network is local, we thought a delay of under a second should be achievable.

    Our Question: Has anybody here got experience with a setup that is both low-latency and user-friendly for the clients streaming the program? Any recommendations would be highly appreciated. (Our current setup is on Windows 7, but a setup on a dedicated Linux box is preferred, if achievable.)
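
    Since a dedicated Linux box is acceptable, here is a minimal sketch of a lower-latency path: push an MPEG-TS stream over UDP multicast on the LAN instead of an HTTP stream, which removes most of the server- and player-side buffering. The device names and multicast address below are assumptions for a V4L2 camera and an ALSA input:

        ffmpeg -f v4l2 -i /dev/video0 -f alsa -i hw:0 \
            -c:v libx264 -preset ultrafast -tune zerolatency \
            -c:a aac -f mpegts 'udp://239.0.0.1:1234?pkt_size=1316'

    Clients on the same network open udp://@239.0.0.1:1234 in VLC; latency is typically well under a second. The trade-off is that multicast over Wi-Fi can be lossy, so test on your Wireless-N gear before committing to it.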


  • How do you set up FTP with IIS Manager users in an NLB environment with shared IIS configs?

    - by William Jens
    I've set up a two-node NLB cluster and used the following to share IIS configs between the nodes: http://blogs.technet.com/b/meamcs/archive/2012/05/30/configuring-iis-7-5-shared-configuration.aspx The IIS configs and content live on a network share via a UNC path. This works: updating IIS settings on one node is visible on the other, and my website works on the individual nodes and on the cluster as a whole. I'm also able to set up an FTP site and connect successfully with my Windows login. However, I want to use IIS Manager authentication as described in: http://www.iis.net/learn/publish/using-the-ftp-service/configure-ftp-with-iis-manager-authentication-in-iis-7 I've tried using "Network Service" with the FTP COM object, as well as a dedicated user account that exists on all three hosts, but every time I try to log in with an IIS user I get something like the following:

    IISWMSVC_AUTHENTICATION_UNABLE_TO_READ_CONFIG
    An unexpected error occurred while retrieving the authentication information.
    Exception: System.Runtime.InteropServices.COMException (0x8007052E): Filename: Error:
       at Microsoft.Web.Administration.Interop.AppHostWritableAdminManager.GetAdminSection(String bstrSectionName, String bstrSectionPath)
       at Microsoft.Web.Administration.Configuration.GetSectionInternal(ConfigurationSection section, String sectionPath, String locationPath)
       at Microsoft.Web.Management.Server.ConfigurationAuthenticationProvider.GetSection(ServerManager serverManager)
    Process: dllhost  User=NT AUTHORITY\NETWORK SERVICE

    Can anyone point me in the right direction here?
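
    A hedged first check, since 0x8007052E is a logon failure: the identity hosting the FTP COM object has to be able to read both the shared applicationHost.config on the UNC path and the local redirection.config in %windir%\system32\inetsrv\config on each node. Assuming placeholder share and account names, verifying the ACL on the share would look something like:

        icacls \\fileserver\iis-config /grant "DOMAIN\svc-iisftp:(OI)(CI)R"

    It is also worth re-exporting and re-applying the shared-configuration credentials on both nodes; the shared-configuration dialog stores its own copy of the password, and a stale stored credential is one plausible source of exactly this COMException.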


  • A good Linux alternative to Ubuntu for a work environment.

    - by Roozak
    Hi, I'm running a decent laptop with 3 GB of RAM and a 2 GHz Core Duo. I use it mainly for work, which requires several SSH and SFTP connections to servers and a VM running most of the time; nothing much more intensive than that. I like Ubuntu 9.10, but Nautilus and the top/bottom panels constantly freeze on me: dare I say it, a lot more problems than I had with Windows Vista. I'm just looking for recommendations for other operating systems that would be suitable for the task. Thanks


  • How to connect to a WPA2-encrypted wireless network when booted from the CloneZilla live CD?

    - by caligula
    My intention is to back up my laptop's (Dell Vostro 3350) sda1 partition to my desktop. After some googling I decided to use CloneZilla for that purpose. I have an OpenSSH server installed and running on the desktop, so I put the CloneZilla CD in the CD-ROM drive, booted from it, and chose an option something like "use ssh server to store image". I was then asked to choose a network interface; I chose wlan0 and dropped to a shell to configure the connection manually. That's where I got into trouble: the Wi-Fi network I want to use is WPA2-encrypted, and I don't know how to connect to it from the command line. Can somebody assist me? Thanks in advance.
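
    A minimal sketch from the CloneZilla shell, assuming the live image ships wpa_supplicant and a DHCP client (recent CloneZilla live CDs do) and with MyNetwork as a placeholder SSID:

        wpa_passphrase 'MyNetwork' 'your-passphrase' > /tmp/wpa.conf
        wpa_supplicant -B -i wlan0 -c /tmp/wpa.conf
        dhclient wlan0

    The -B flag backgrounds wpa_supplicant once the WPA2 handshake completes; dhclient then picks up an address, after which you can return to the CloneZilla menu and point it at the SSH server.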


  • What program will monitor the chords I'm playing through my MIDI keyboard live?

    - by Jasper
    Leaving Cubase 5 out of the picture: when I use Reason 4.0 to compose something, I need a program running that shows the chords I'm pressing while I'm pressing them. See, I'm a new keyboardist, and a tool like this would dramatically improve my accuracy. At some point I'll stop needing such aids, but for now I'd really like to have something of that sort. Free or commercial suggestions are both welcome.
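
    If no ready-made monitor turns up, a small sketch in Python with the mido library (assuming installing mido plus its python-rtmidi backend is acceptable) prints the currently held notes from the default MIDI input:

        import mido  # pip install mido python-rtmidi

        NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

        def name(n):
            # convert a MIDI note number to a name like C4 (middle C = 60)
            return NOTE_NAMES[n % 12] + str(n // 12 - 1)

        held = set()
        with mido.open_input() as port:  # opens the default MIDI input port
            for msg in port:
                if msg.type == 'note_on' and msg.velocity > 0:
                    held.add(msg.note)
                elif msg.type in ('note_off', 'note_on'):  # note_on with velocity 0 means note off
                    held.discard(msg.note)
                print(' '.join(name(n) for n in sorted(held)))

    One caveat: on Windows only one application can usually open a MIDI port at a time, so feeding both Reason and this script may require a virtual MIDI cable that duplicates the keyboard's output.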


  • Recommendations for a VMware web server environment with a load balancer.

    - by Ben
    We run IIS websites on a VMware production server that pulls image and video content from a separate IIS instance on another server (the media server). The media calls are plain http:// requests, not a streaming application. During peak traffic periods we clone the production server five times and have a load balancer distribute traffic across all five production servers, but the media server does not get ramped up, and we've noticed its processing and resources get very taxed during these periods. Would it make sense to run the media server's IIS instance locally on the production server, have it cloned along with the production servers, and add a rule on the load balancer to route these media calls from the website? Or would it be better to allocate more resources (memory and CPUs) to the media server VM and not clone it with the production servers? Recommendations are sincerely appreciated.
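
    For the load-balancer-rule option, a sketch in HAProxy syntax, purely as an illustration (the path prefixes are assumptions, and your balancer's rule language will differ): media requests peel off to the cloned media instances while everything else stays on the web pool.

        frontend www
            bind *:80
            acl is_media path_beg /media /images /video
            use_backend media_pool if is_media
            default_backend web_pool

    One design note either way: first measure whether the media server is CPU-bound or bandwidth-bound. Cloning helps with CPU and connection counts, while a single bigger VM only helps if the network link is not what is saturated.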


  • How to configure a catch-all in an Exchange 2010 hub-transport environment?

    - by Itay Levin
    I'm getting the "delivery failed" reply from the postmaster and I don't want it, because it lets people discover which users really exist on the Exchange server. I also have a lot of users (10K) in my application, and I don't want to create a mailbox for each of them. Is it possible to get this done in Exchange 2010 SP1 with a hub-transport-only configuration, or must I use an Edge Transport server as indicated in http://technet.microsoft.com/en-us/library/bb691132(EXCHG.80).aspx?
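
    A hedged pointer rather than a full recipe: on a Hub-only deployment the usual approach is a small SMTP receive transport agent that rewrites any unknown recipient to a single catch-all mailbox before the categorizer can generate the NDR (Microsoft's transport-agent SDK samples and the open-source CatchAllAgent follow this pattern). Registering such an agent, once compiled, uses the standard cmdlets; the DLL path and factory class name below are placeholders:

        Install-TransportAgent -Name "CatchAll Agent" `
            -TransportAgentFactory CatchAll.CatchAllFactory `
            -AssemblyPath "C:\Agents\CatchAllAgent.dll"
        Enable-TransportAgent -Identity "CatchAll Agent"

    This also sidesteps creating 10K mailboxes: the agent maps every unknown address in your domain onto one mailbox that your application can poll.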


  • Can I save an Apache environment variable value with SetEnv?

    - by Nicholas Tolley Cottrell
    I am running Apache 2.2 with Tomcat 6 and have several layers of URL rewriting going on, both in Apache with RewriteRule and in Tomcat. I want to pass through the original REQUEST_URI that Apache sees so that I can log it properly for "page not found" errors, etc. In httpd.conf I have the line:

        SetEnv ORIG_URL %{REQUEST_URI}

    and in mod_jk.conf I have:

        JkEnvVar ORIG_URL

    which I thought should make the value available via request.getAttribute("ORIG_URL") in servlets. However, all I see is "%{REQUEST_URI}", so I assume SetEnv doesn't interpret the %{...} syntax. What is the right way to get the URL the user requested in Tomcat?
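
    The guess at the end is right: SetEnv assigns a literal string at configuration time and never expands %{...}. Two directives that do evaluate per-request can feed the existing JkEnvVar line; a sketch of both (either one alone is enough, placed before any other rewrites so the original URI is captured):

        # mod_setenvif: capture the request path into ORIG_URL
        SetEnvIf Request_URI "^(.*)$" ORIG_URL=$1

        # mod_rewrite: a no-op rule whose only effect is setting the variable
        RewriteEngine On
        RewriteRule ^ - [E=ORIG_URL:%{REQUEST_URI}]

    With either in place, request.getAttribute("ORIG_URL") on the Tomcat side should return the URL as Apache originally received it, before any later RewriteRule touches it.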


  • How to set up the hosts file for a local environment?

    - by n00b0101
    I'm trying to create subdomains on my localhost and am way out of my territory. I'm running MAMP on Mac OS X, and I think I have to do the following (assuming I want to create me.localhost.com and you.localhost.com):

    (1) Edit /private/etc/hosts. Right now it looks like this:

        127.0.0.1        localhost
        255.255.255.255  broadcasthost
        ::1              localhost
        fe80::1%lo0      localhost

    So do I just make it:

        127.0.0.1        localhost
        127.0.0.1        me.localhost.com
        127.0.0.1        you.localhost.com
        255.255.255.255  broadcasthost
        ::1              localhost
        fe80::1%lo0      localhost

    (2) I'm assuming I don't need to mess with DNS at all because it's local, so the hosts file should suffice?

    (3) And then I need to edit my httpd.conf file to include virtual hosts? I tried this, but it's not picking them up:

        NameVirtualHost *
        <VirtualHost *>
            DocumentRoot "/Applications/MAMP/htdocs"
            ServerName localhost
        </VirtualHost>
        <VirtualHost *>
            DocumentRoot "/Applications/MAMP/htdocs/me.localhost.com"
            ServerName me.localhost.com
        </VirtualHost>
        <VirtualHost *>
            DocumentRoot "/Applications/MAMP/htdocs/you.localhost.com"
            ServerName you.localhost.com
        </VirtualHost>

    Not sure if I'm way off base here. Help is greatly appreciated!
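
    A hedged addition to step (3), since MAMP differs from a stock Apache in one relevant way: its bundled Apache listens on port 8888 by default, and in Apache 2.2 the NameVirtualHost argument has to match the <VirtualHost> arguments exactly. A sketch with the port pinned explicitly (swap in 80 if you have changed MAMP's ports):

        NameVirtualHost *:8888

        <VirtualHost *:8888>
            ServerName me.localhost.com
            DocumentRoot "/Applications/MAMP/htdocs/me.localhost.com"
        </VirtualHost>

    The hosts-file entries in step (1) look correct as written; just restart MAMP's servers after editing httpd.conf. Nothing requires the .com suffix, either: me.localhost works just as well, as long as the hosts entry matches the ServerName.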


  • In an environment with multiple Wi-Fi access points, do wireless clients sometimes connect to more than one at the same time?

    - by Bobby Burgess
    This is more of a curiosity than a problem. In this new office I have two D-Link DAP-2553s connected in a master/slave array (which just means the master keeps certain configuration options aligned with the slave). The network is set to 802.11n-only, and each AP has the same SSID and WPA2 key; the only difference is that they are on different channels (1 and 11). The Wi-Fi network itself is working well: users can roam around, and the signal and speed are fairly consistent. However, when I look at the 802.11 client list in the web admin page of each of the two APs, I see that certain clients are connected to both for extended periods of time, though I assume they only pass data through one of them. Not every client is seen on each AP, but at any given time the same MAC address of a Wi-Fi adapter can be associated (and remain associated) with both APs. The client list auto-refreshes every few seconds, so I believe I'm looking at current rather than stale information. One of the adapters that consistently associates with both APs is an Intel Centrino Wireless-N 1030 (a laptop chip). Is it part of the Wi-Fi standard that one adapter can hold more than one concurrent association on separate APs?


  • Virtual environment firewall with CSF + iptables rules on the VMs?

    - by luison
    We are getting into virtualization with a Proxmox VE (OpenVZ + KVM) server. Our plan for the firewall is to have CSF (http://configserver.com/cp/csf.html) running on the host machine, as we've had a reasonably good experience with it in the past. Beyond that, we plan simple firewall rules on the VMs (mostly OpenVZ containers sharing the host kernel) and maybe a few specific fail2ban rules. I would appreciate comments from anyone with similar experience. I understand all traffic comes via the host machine, so a combined firewall there plus specific firewalling on each VM should work, although some iptables rules are hard to get working inside OpenVZ containers.
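
    On the last point, a sketch of the usual culprit: iptables inside an OpenVZ container can only use the match and target modules the host explicitly exposes to it, which is controlled by the IPTABLES line in /etc/vz/vz.conf on the host (the exact module list below is an assumption; extend it to cover whatever your per-container rules need, then restart the containers):

        IPTABLES="iptable_filter iptable_mangle ipt_limit ipt_multiport ipt_REJECT ipt_LOG ip_conntrack ipt_state"

    Stateful rules (-m state) in particular refuse to load inside containers unless ip_conntrack and ipt_state appear in that list, which matches the "hard to get to work" symptom.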


  • How to set a default page for Apache that doesn't live in the directory and only applies if there is no index?

    - by Kyle
    I know, this sounds complicated. =D I made a PHP file browser as an alternative to the Apache directory listing; I needed it for logic purposes, it does extra things for me, etc. So instead of dropping this file into every one of my directories, how could I get it to "show up" in all the directories that don't have an index (and would otherwise use the Apache directory listing by default)? Thanks for the help! Edit: I wonder if this could be done using Alias and DirectoryIndex? Is it possible to alias to a file?
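
    The Edit is on the right track, and no Alias is needed: DirectoryIndex accepts a local URL as well as a bare filename, so one copy of the browser can cover every directory. A minimal sketch, with /filebrowser.php as a placeholder for wherever the script lives under the document root:

        # try the usual indexes first, then fall back to the shared browser
        DirectoryIndex index.html index.php /filebrowser.php

    Directories that have their own index keep it; every other directory falls through to the shared script, which can read $_SERVER['REQUEST_URI'] to know which directory it is listing.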


  • How to organize a deployment process in a Chef-controlled environment?

    - by Alex
    I have a Linux-based web infrastructure consisting of 15 virtual machines and over 50 services, fully controlled by Chef. Most of the services are developed internally. The current deployment process is triggered by a shell script: a build system (a mix of Python and shell scripts) packages the services as .deb files and puts the packages in a repo. It then runs apt-get update on all 15 nodes, because the standard Chef apt cookbook only runs apt-get once per day and we definitely do not want to run apt-get update unconditionally on every chef-client wake. Finally, the build system restarts the chef-client daemons on all 15 nodes (we need this step because of Chef's pull nature). This process has a number of drawbacks we want to address. First, it is asynchronous: the deployment script does not check the chef-client logs after the restart, so we don't even know whether the deployment was successful, and it doesn't even wait for the Chef clients to complete their runs. Second, we definitely do not want to force chef-client restarts on all nodes when we usually deploy only a small number of packages. And third, I am not quite sure that using chef-client for deployment is legitimate at all; perhaps we have been doing it wrong from the start. Please share your thoughts/experience.
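
    On the first drawback, a hedged sketch: instead of restarting the daemons and hoping, the build script can drive the runs through knife ssh, which executes a command on every node matching a search query and reports each node's output and exit status synchronously (the query, user, and sudo arrangement below are assumptions about your environment):

        knife ssh 'name:*' 'sudo chef-client' --ssh-user deploy

    The script then knows immediately whether every run converged, and narrowing the query (for example to the role that actually received new packages) also addresses the second drawback of touching all 15 nodes for a small deployment.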


  • Linux Live CD only works when Windows is in Legacy mode?

    - by Vee
    I have asked a similar question before and no one was able to help, but I think that was because I phrased it poorly; this is a better restatement. I have Windows 8 and Linux Mint dual-booted on my PC. When I try to boot Linux from the CD-ROM only, it gives me the following error:

        error: failure reading sector 0x0 from 'hd1'
        error: you need to load the kernel first.
        Press any key to continue...

    Linux Mint otherwise works fine; the error only appears when I try to boot from CD. Booting Linux from the CD only worked once I switched Windows to Legacy mode in the BIOS settings; when I changed it back to UEFI, the same error returned. Why is this, and how can I fix it? I am somewhat new, so is there anything else I should know about all of this? NOTE: I converted the Linux install to UEFI mode using boot-repair, but that still did not solve booting from the CD-ROM.


  • How to secure an Internet-facing Elastic Search implementation in a shared hosting environment?

    - by casperOne
    (Originally asked on Stack Overflow, where it was recommended that I move it here.) I've been going over the documentation for Elastic Search; I'm a big fan, and I'd like to use it to handle the search for my ASP.NET MVC app. That introduces a few interesting twists, however. If the ASP.NET MVC application were on a dedicated machine, it would be simple to spool up an instance of Elastic Search and use the TCP transport to connect locally. However, I'm not on a dedicated machine for the ASP.NET MVC application, nor does it look like I'll move to one anytime soon. That leaves hosting Elastic Search on another machine (in the *NIX world), and I would probably go with shared hosting there. One of the biggest things lacking from Elastic Search, however, is that it doesn't support HTTPS and basic authentication out of the box. If it did, this question wouldn't exist: I'd simply host it somewhere, set an incredibly strong password, and enable HTTPS (possibly with a self-signed certificate). But that's not the case. Given that, what is a good way to expose Elastic Search over the Internet securely? Note: I'm looking for something that, hopefully, will not require writing code to provide shims for the methods I want (in other words, writing forwarders).
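
    The usual no-code approach is to hide Elastic Search behind a reverse proxy that supplies the missing TLS and basic auth. A hedged sketch in nginx (the hostname, certificate paths, and htpasswd file are assumptions, and Elastic Search is assumed to be bound to localhost:9200 only):

        server {
            listen 443 ssl;
            server_name search.example.com;              # placeholder hostname
            ssl_certificate     /etc/nginx/ssl/es.crt;   # assumed cert paths
            ssl_certificate_key /etc/nginx/ssl/es.key;

            auth_basic           "Elastic Search";
            auth_basic_user_file /etc/nginx/.htpasswd;   # created with htpasswd

            location / {
                proxy_pass http://127.0.0.1:9200;        # ES itself never faces the Internet
            }
        }

    On shared hosting you may not control nginx, in which case the same pattern applies with whatever proxy the host provides; the key point is that Elastic Search itself should never listen on a public interface.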

