Search Results

Search found 25585 results on 1024 pages for 'multiple variables'.

Page 180/1024

  • How to host multiple FLEX applications in IIS7

    - by Devtron
    Hello, I manually deploy a FLEX application to my web server (IIS 7). There are two virtual directories: 1) Default, 2) myFlexApp1. myFlexApp1 is where my working FLEX application resides. I now need to deploy a different FLEX application (let's call it myFlexApp2) to the same web server. I set up a virtual directory for [myFlexApp2] and it complains about the "bindings" using port 80, which is already used by [myFlexApp1]. I have tried to give them separate host names in their binding properties, for example myFlexApp1.mydomain.com and myFlexApp2.mydomain.com, but I can never get [myFlexApp2] to show from an external browser. I was able to get one or the other to display, but never both. Here is what I need:

        myFlexApp1.mydomain.com -- myFlexApp1
        calendar.mydomain.com   -- myFlexApp2
        test.mydomain.com       -- myFlexApp1 (the default URL)

    Is this possible? What am I doing wrong? I even tried to edit the hosts file in [C:\Windows\System32\drivers\etc], but that didn't work either. How can I serve up two FLEX applications on IIS 7? It shouldn't be this hard!
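
    A minimal sketch of the host-header setup, assuming the two apps end up as separate IIS sites named myFlexApp1 and myFlexApp2 (site names and domains are taken from the question; nothing here has been tested against this exact server):

        %windir%\system32\inetsrv\appcmd set site /site.name:"myFlexApp1" /+bindings.[protocol='http',bindingInformation='*:80:myflexapp1.mydomain.com']
        %windir%\system32\inetsrv\appcmd set site /site.name:"myFlexApp1" /+bindings.[protocol='http',bindingInformation='*:80:test.mydomain.com']
        %windir%\system32\inetsrv\appcmd set site /site.name:"myFlexApp2" /+bindings.[protocol='http',bindingInformation='*:80:calendar.mydomain.com']

    With host headers like these, both sites can share port 80 on one IP. Note that the hosts file on the server only affects the server's own lookups; for external browsers, the public DNS records for each name must point at the server's address.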

    Read the article

  • Automatically distinguish between multiple HDDs in Linux?

    - by Jakobud
    I'm running Ubuntu Server 9.10. I have two external USB HDDs, and I use each for different backup purposes: certain data gets stored on one HDD, and different information gets stored on the other. I want to write a script that can look at an external HDD and determine which one it is, so that it can copy the proper information to it. Is there a way for Linux to determine this? For example, if I see one HDD as /dev/sdc1, then unplug it and plug in the other HDD, will Linux see it as /dev/sdd1 or will it be /dev/sdc1? I'm a bit of a Linux newb and I don't quite understand how it assigns the /dev/sdXX names to drives.
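
    A sketch of the usual approach: the /dev/sdX names depend on detection order and are not stable, but each filesystem has a UUID (shown by blkid) that is, and udev exposes it under /dev/disk/by-uuid/. The UUIDs and source paths below are placeholders:

        #!/bin/bash
        # Run `sudo blkid` once to find the real UUIDs of the two backup drives.
        BACKUP_A_UUID="1111-AAAA"
        BACKUP_B_UUID="2222-BBBB"

        for uuid in "$BACKUP_A_UUID" "$BACKUP_B_UUID"; do
            dev="/dev/disk/by-uuid/$uuid"
            [ -e "$dev" ] || continue            # this drive is not plugged in
            mnt=$(mktemp -d)
            mount "$dev" "$mnt"
            if [ "$uuid" = "$BACKUP_A_UUID" ]; then
                rsync -a /srv/backup-set-a/ "$mnt/"   # hypothetical source path
            else
                rsync -a /srv/backup-set-b/ "$mnt/"   # hypothetical source path
            fi
            umount "$mnt" && rmdir "$mnt"
        done

    Filesystem labels (/dev/disk/by-label/) work the same way and are easier to read than UUIDs.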

    Read the article

  • email archive for multiple users

    - by evanmcd
    Hi, I'm moving a web site from one server to another, and am realizing that I need to move the name servers for the domain as well (they are set to the current host, not to the registrar). So, knowing that email services will stop as soon as I switch the DNS, I'm scrambling to figure out how to archive and make available email data for folks that have mostly been using webmail for the past few years, and may not even have a computer on which to install a client to download the mail to. What does one do in this situation? Thanks for any help offered! Evan

    Read the article

  • Permissions in OS X for iTunes library with multiple users

    - by John
    I currently have a lot of music on an external drive and my iTunes library set up from there. However, periodically, when the external drive isn't connected, iTunes will default back to the library location in my home directory. I don't want to mess with an external drive, as my Mac's HD is large enough to house the music collection. However, I have 4 family members – all with their own logins – using this same gob of music. I don't want four copies of the library, only one that all libraries reference. So, what I want to do is:

        1. Move all music files to a shared directory at /Macintosh HD/users/music. I created this directory and adjusted permissions so all four users can read and write to it.
        2. Get all four accounts to reference this library instead of the external or local home locations.

    I am hoping I can just check the "keep library organized" box in my account, which is the admin, and let iTunes move it all, then delete the current libraries for each account and re-add from the new shared location. Will the iTunes organization process cause permissions issues, for example by restricting access to my account only, removing write permissions, or some other gotcha? I am having a hard time coming up with a smooth solution that won't break everything and leave me with mega duplicates or access issues. I would prefer not to do any XML library file editing if possible. Am I dreaming?
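
    A sketch of the permissions side only, assuming the shared folder from the question and the stock OS X ACL syntax (an inheritable ACL is what keeps files added later by one account writable by the others); paths and group are assumptions, not verified against this setup:

        sudo chown -R :staff "/Macintosh HD/users/music"
        sudo chmod -R g+rwX "/Macintosh HD/users/music"
        # Inheritable ACL so new files/folders created by any user stay group-writable:
        sudo chmod -R +a "group:staff allow list,add_file,search,add_subdirectory,delete_child,readattr,writeattr,readextattr,writeextattr,file_inherit,directory_inherit" "/Macintosh HD/users/music"

    Without something inheritable, files that iTunes copies in while "keep organized" runs under one account typically end up writable only by that account, which is the gotcha the question anticipates.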

    Read the article

  • Best practice for scaling a single application source to multiple nodes

    - by Andrew Waters
    I have an application which needs to scale horizontally to cover web and service nodes (at the moment they're all on one) but interact with the same set of databases and source files (both application code and custom assets). The database is no problem; it's already handled with replication in MongoDB. The configuration of the servers is also the same (100% Linux). This question is literally about sharing a filesystem between machines so that its content is always correct, regardless of the node accessing it. My two thoughts so far have been NFS and SAN: SAN being prohibitively expensive, and NFS showing some performance issues on the second node with regard to glob()ing in PHP. Does anyone have recommended strategies or other techniques that don't involve sharding data across nodes, or any potential gotchas in NFS that may cause slow disk seek times? To give you an idea of the scale, the main node initialises its application modules in ~0.01 seconds; the secondary takes ~2.2 seconds. They're VMs inside a local virtual network in ESXi and ping time between them is ~0.3 ms.
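
    For the NFS side, a sketch of mount options that usually help metadata-heavy access patterns such as PHP's glob()/stat() calls; the hostname, export path, and values are illustrative, not tuned for this setup:

        # /etc/fstab on the secondary node (illustrative)
        master:/srv/app   /srv/app   nfs   ro,noatime,actimeo=60,rsize=32768,wsize=32768,hard   0 0

    Raising actimeo caches attribute lookups at the cost of freshness. A common alternative for the code itself is to rsync/deploy the application to local disk on each node and keep only shared, frequently changing assets on NFS.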

    Read the article

  • Multiple websites each with an SSL certificate of its own

    - by ServerDown
    Hi, we run CentOS and Plesk with Apache, PHP and MySQL. There are around 25 sites and each of them now needs an SSL certificate. The host cannot have more than 16 IPs on the same server. Is it possible to have all these sites use just one IP address and have an SSL certificate set up for each site? If yes, please let me know how I can set this up. Thanks
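
    A sketch of the SNI-based approach, which lets one IP serve multiple certificates if the Apache/OpenSSL build on the box is new enough (roughly Apache 2.2.12+ against OpenSSL 0.9.8f+; very old clients that don't send SNI get the first vhost's certificate). Names and paths are placeholders, and under Plesk this is normally driven through Plesk's per-domain SSL settings rather than hand-edited vhosts:

        NameVirtualHost *:443

        <VirtualHost *:443>
            ServerName site1.example.com
            SSLEngine on
            SSLCertificateFile    /etc/ssl/site1.crt
            SSLCertificateKeyFile /etc/ssl/site1.key
            DocumentRoot /var/www/site1
        </VirtualHost>

        <VirtualHost *:443>
            ServerName site2.example.com
            SSLEngine on
            SSLCertificateFile    /etc/ssl/site2.crt
            SSLCertificateKeyFile /etc/ssl/site2.key
            DocumentRoot /var/www/site2
        </VirtualHost>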

    Read the article

  • Android: Having trouble creating a subclass of Application to share data with multiple Activities

    - by Mike
    Hello, I just finished a couple of activities in my game and now I was going to start wiring them both up to use real game data, instead of the test data I was using just to make sure each piece worked. Since multiple Activities will need access to this game data, I started researching the best way to pass it to my Activities. I know about using putExtra with Intents, but my GameData class has quite a bit of data and not just simple key/value pairs. Besides quite a few basic data types, it also has large arrays. I didn't really want to try to pass all that unless I can pass the entire object instead of just key/data pairs. I read the following post and thought it would be the way to go, but so far I haven't got it to work: "Android: How to declare global variables?" I created a simple test app to try this method out, but it keeps crashing, and my code seems to look the same as in the post above – except I changed the names. Here is the error I am getting:

        12-23 00:50:49.762: ERROR/AndroidRuntime(608): Caused by: java.lang.ClassCastException: android.app.Application

    It crashes on the following statement:

        GameData newGameData = ((GameData) getApplicationContext());

    Here is my code:

        // GameData.java
        package mrk.examples.StaticGameData;

        import android.app.Application;

        public class GameData extends Application {
            private int intTest;

            GameData() {
                intTest = 0;
            }

            public int getIntTest() {
                return intTest;
            }

            public void setIntTest(int value) {
                intTest = value;
            }
        }

        // My main activity
        package mrk.examples.StaticGameData;

        import android.app.Activity;
        import android.content.Intent;
        import android.os.Bundle;
        import android.util.Log;

        public class StaticGameData extends Activity {
            int intStaticTest;

            /** Called when the activity is first created. */
            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);
                GameData newGameData = ((GameData) getApplicationContext());
                newGameData.setIntTest(0);
                intStaticTest = newGameData.getIntTest();
                Log.d("StaticGameData", "Well: IntStaticTest = " + intStaticTest);
                newGameData.setIntTest(1);
                Log.d("StaticGameData", "Well: IntStaticTest = " + intStaticTest
                        + " newGameData: " + newGameData.getIntTest());
                Intent intentNew = new Intent(this, PassData2Activity.class);
                startActivity(intentNew);
            }
        }

        // My test activity, to see if it can access the data and its previous state from the last activity
        package mrk.examples.StaticGameData;

        import android.app.Activity;
        import android.os.Bundle;
        import android.util.Log;

        public class PassData2Activity extends Activity {
            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);
                GameData gamedataPass = ((GameData) getApplicationContext());
                Log.d("PassData2Activity", "IntTest = " + gamedataPass.getIntTest());
            }
        }

    Below is the relevant portion of my manifest:

        <application android:icon="@drawable/icon" android:label="@string/app_name">
            <activity android:name=".StaticGameData" android:label="@string/app_name">
                <intent-filter>
                    <action android:name="android.intent.action.MAIN" />
                    <category android:name="android.intent.category.LAUNCHER" />
                </intent-filter>
            </activity>
            <activity android:name=".PassData2Activity"></activity>
        </application>

        <application android:name=".GameData" android:icon="@drawable/icon" android:label="@string/app_name">
        </application>

    Thanks in advance for helping me understand why this code is crashing. Also, if you think this is just the wrong approach to letting multiple activities access the same data, please give your suggestion. Please keep in mind that I am talking about quite a few variables and some large arrays.
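
    For reference, the crash is consistent with the manifest above: a manifest may contain only one <application> element, and the one actually in use has no android:name, so Android instantiates a plain android.app.Application, which cannot be cast to GameData. A sketch of the usual fix (a single <application> element that names the subclass; untested against this project):

        <application android:name=".GameData"
                     android:icon="@drawable/icon"
                     android:label="@string/app_name">
            <activity android:name=".StaticGameData" android:label="@string/app_name">
                <intent-filter>
                    <action android:name="android.intent.action.MAIN" />
                    <category android:name="android.intent.category.LAUNCHER" />
                </intent-filter>
            </activity>
            <activity android:name=".PassData2Activity" />
        </application>

    In an Activity, getApplication() usually returns the same object as getApplicationContext() and is already typed as Application, so the cast reads a little more naturally there.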

    Read the article

  • Rename multiple files as "Modified Date/Time" using cmd or Powershell

    - by Mehper C. Palavuzlar
    I have hundreds of JPG files in a folder. I want to rename each file so that the file name is replaced with the "modified date/time" of that file, in the form DD.MM.YYYY.HH.MM.jpg. For example:

        Before      After
        001.jpg     11.01.2011.16.58.jpg
        002.jpg     12.01.2011.09.32.jpg
        003.jpg     14.01.2011.12.41.jpg
        ...         ...

    Since a colon (:) cannot be used in file names, the colon between HH and MM must be replaced with a period. I don't want to use a 3rd-party tool. Can you provide me with the code to achieve this in PowerShell or on the command line?
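
    A minimal PowerShell sketch (run from inside the folder; it assumes no two photos share the same modified minute, otherwise the second rename to that name will fail):

        Get-ChildItem *.jpg | Rename-Item -NewName { $_.LastWriteTime.ToString("dd.MM.yyyy.HH.mm") + ".jpg" }

    If collisions are likely, extending the format string to include seconds ("dd.MM.yyyy.HH.mm.ss") is the simplest way around them.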

    Read the article

  • Corrupt Windows 7, Chkdsk finds errors continuously after multiple tries

    - by mj_
    My Windows 7 instance is all corrupt on my Dell XPS laptop (my kid hibernated it as it was starting). I have the Windows 7 installation disc and have tried to repair the installation. The repair goes through and says it's done, but it doesn't know whether it repaired successfully; it just says to reboot. I can also open a command line and run chkdsk /F manually there. What's strange is that I've run it 20 times and chkdsk claims that it fixes stuff every time. Eventually it stops and I reboot. Much to my chagrin, my machine is still broken. Chkdsk gives an error at the end that it can't write a log file. I've also run a full diagnostic using the Dell diagnostic tools, which say that everything is okay. Any suggestions regarding what I should do next? Any help appreciated. mj

    Read the article

  • Install multiple Linux OSes using Wubi

    - by soul
    I want to install Linux Mint and Ubuntu on my laptop using a Wubi installation. I have already installed Ubuntu, but when I launch the Wubi installer for Mint I get an error (screenshot not included). Does this mean I can only install one OS using Wubi? Do you know of any solution to this problem, other than installing Mint the usual way? I have XP and Windows 7 installed on the machine; that's why I'm using Wubi, so I don't get into trouble.

    Read the article

  • Multiple servers vs. one big server: performance

    - by pistacchio
    Hi to all! My team of developers has suggested a server structure for an upcoming project we are developing. Our structure is "logical", meaning that the various logical components of the application (it is a distributed one) rely on different servers. Some components are more critical than others and will be subjected to more load. Our proposal was to have one server per component, but the hardware guys suggested replacing the various machines with a single, bigger one running virtual servers. They're going to use blade servers. Now, I'm not an expert at all, but my question to them was: if we need, for example, three 2 GHz CPU / 2 GB RAM machines and you give me one machine with three 2 GHz CPUs and 6 GB of RAM, is it the same? They told me it is. Is this accurate? What are the advantages and disadvantages of both solutions? What are the generally accepted best practices? Could you point me to some URL reference dealing with the problem? Thank you in advance!

    EDIT: Some more info. The (internet/intranet) application is already layered. We have some servers in the DMZ that expose pages to the internet, and the databases are on their own machines. What we want to split (and they want to join) are some web servers that mainly expose web services. One is a DAL that communicates with the database layer, one is our Single Sign-On / User Profile application that gets called once per page, and one is a clone of what is seen on the Internet, to be used on our LAN.

    Read the article

  • Connecting to a server with multiple NICs from another VLAN

    - by Thierry
    I have a Windows 2003 server with 3 NICs on 3 VLANs (this is in domain 1). NIC 1 has a default gateway to my router/firewall (SonicWall). On NICs 2 and 3 I have left the gateway empty, because that is what is advised everywhere. Within this domain and VLANs 1–3 everything works fine. BUT... I have a second domain (domain 2) with a 4th VLAN (all 4 VLANs connected to the same router/firewall) from which my clients need to access the 2003 server in domain 1 (it's my antivirus management console for both domains). When I ping the server from VLAN 4 by its FQDN, it randomly resolves to the IP of NIC 1, 2 or 3 of my 2003 server (logically, because that server is known in DNS with its 3 IP addresses, and that is needed for my VLANs 1–3). I don't really have a problem with that. BUT, I only get an answer from NIC 1 (which sounds logical to me, because it's the only one with a gateway). It is not a router problem, because I'm testing in this phase, and pinging from VLAN 4 to any machine in VLAN 1, 2 or 3 that has one NIC works just fine. If I add a gateway to NIC 2 and NIC 3, I get an answer from all 3 NICs and everything works fine. But I know it's advised not to do that. Can anyone give me advice in this particular case? Would it really be a problem to add a gateway to NICs 2 and 3? They would be pointing to the same router/firewall (only with a different IP address, based on the VLAN). Or is there another good solution to fix this problem? Thanks in advance, Thierry.
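
    The commonly advised alternative to multiple default gateways is one default gateway (on NIC 1) plus persistent static routes for the specific remote subnets the other NICs must answer. A sketch, with placeholder addresses since the question doesn't give the real subnets:

        rem Reach the VLAN 4 client subnet via the router's address on NIC 2's VLAN
        route -p add 192.168.4.0 mask 255.255.255.0 192.168.2.1

    Whether that is enough here depends on which NIC's address the clients resolve; registering only NIC 1's address in DNS for the name the console is reached by is the other common fix.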

    Read the article

  • Windows 7 - Remote Desktop - multiple credentials

    - by w-
    Hi, my home network consists of a couple of XP machines and a Windows 7 box. One of the XP boxes acts as a shared server accessed via Remote Desktop. I have an account on there and my gf has another. Previously I was able to save RDC shortcuts to this shared server, including credentials, so to access the server from a computer on the network I just needed to open the shortcut. I thus had two shortcuts, one for my account and one for my gf's. Windows 7 seems to store credentials for a box based on machine name, i.e. I can only store one set of credentials per machine name. This seems incredibly stupid, so my question is: in Windows 7, is there some way to have RDC shortcuts that use different credentials to log in to the same target box? Thanks
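
    A sketch of one workaround: Windows keys the saved RDP credential on TERMSRV/<name you connect to>, so connecting to the same box under two different names gives two independent saved credentials. Hosts-file aliases are one way to get the second name (the IP and names below are placeholders):

        # C:\Windows\System32\drivers\etc\hosts - two aliases for the same XP box
        192.168.1.10   server-me
        192.168.1.10   server-gf

    Then save one RDC shortcut to server-me with one account and one to server-gf with the other. The credentials can also be pre-stored from a prompt, e.g.:

        cmdkey /generic:TERMSRV/server-me /user:ServerBox\me /pass:SecretHere
        cmdkey /generic:TERMSRV/server-gf /user:ServerBox\gf /pass:OtherSecret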

    Read the article

  • Apache redirect multiple domain names from https

    - by Cyril N.
    My server serves two main websites, say www.google.com and www.facebook.com (yeah, I know :p). I want them to be served via HTTPS. Using Apache, I defined a vhost file in sites-available/enabled containing this:

        <VirtualHost *:80>
            ServerName google.com
            Redirect / https://www.google.com/
        </VirtualHost>

        <VirtualHost *:80>
            ServerName facebook.com
            Redirect / https://www.facebook.com/
        </VirtualHost>

        <VirtualHost *:80>
            DocumentRoot /srv/www/google/www/
            ServerName www.google.com
            ServerAlias www.facebook.com
            <Directory ... />
            # Google & Facebook point to the same directory (crazy, right?)
            # rest of the config
        </VirtualHost>

        <VirtualHost *:443>
            SSLEngine On
            SSLCertificateFile /path/to/google.crt
            SSLCertificateKeyFile /path/to/google.key
            DocumentRoot "/srv/www/google/www/"
            ServerName www.google.com
            <Directory .../>
            # rest of the config
        </VirtualHost>

        <VirtualHost *:443>
            SSLEngine On
            SSLCertificateFile /path/to/facebook.crt
            SSLCertificateKeyFile /path/to/facebook.key
            DocumentRoot "/srv/www/google/www/"
            ServerName www.facebook.com
            <Directory .../>
            # rest of the config
        </VirtualHost>

    If I access https://www.google.com, HTTPS works correctly. If I access https://www.facebook.com, HTTPS works correctly. If I access http://www.google.com, HTTP works correctly (without HTTPS). If I access http://www.facebook.com, HTTP works correctly (without HTTPS). BUT: if I access https://facebook.com, it fails, saying that the SSL certificate is not what was expected: google.com instead of facebook.com. Based on my configuration file I understand why, so I tried to add:

        <VirtualHost *:443>
            SSLEngine On
            ServerName facebook.com
            Redirect / https://www.facebook.com/
        </VirtualHost>

    But then I can't access facebook.com or www.facebook.com at all, via HTTP or HTTPS. So my question is quite simple: how can I redirect all HTTPS access to facebook.com (and eventually all sub-facebooks: facebook.fr, www.facebook.fr, etc.) to www.facebook.com (the www domain) over HTTPS? Thanks for your help! :)
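
    The missing piece in the attempted fix is a certificate: a *:443 vhost cannot redirect until the TLS handshake has completed, so the facebook.com vhost still needs SSL directives and a certificate whose names cover facebook.com (and the other bare/alternate domains), via SAN entries, a wildcard, or a second certificate if SNI is available. A sketch using the question's placeholder paths:

        <VirtualHost *:443>
            ServerName facebook.com
            ServerAlias facebook.fr www.facebook.fr
            SSLEngine On
            SSLCertificateFile    /path/to/facebook.crt   # must list facebook.com (and the .fr names) among its subject names
            SSLCertificateKeyFile /path/to/facebook.key
            Redirect permanent / https://www.facebook.com/
        </VirtualHost>

    If the certificate only covers www.facebook.com, browsers will always warn on https://facebook.com before any redirect can happen, no matter how the vhosts are arranged.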

    Read the article

  • Printer offline until spooler service is restarted multiple times

    - by Zian Choy
    When I try to print from my ThinkPad to a printer shared through a Windows 7 Homegroup hosted by a desktop computer, I often have to restart the Print Spooler service several times before the job will go through. In particular, this problem occurs when the desktop is in sleep mode when the print job is started and is then brought out of sleep mode after the print job has been kicked off. Both computers are running Windows 7 32-bit edition with the latest patches. I have tried the following with no improvement:

        - the SNMP registry hack (see the MS KB for details)
        - following the instructions in a blog post entitled "Sharing Printers on Vista 64-bit"
        - looking at "Printer offline until spooler service is restarted"

    Read the article

  • dhclient append settings from multiple DHCP servers

    - by Brian
    I have a server with two interfaces connected to two separate networks, using DHCP for both. When dhclient writes /etc/resolv.conf, I would like it to append settings that aren't already there. For instance, if I receive from one DHCP server:

        nameserver 10.0.0.1
        search one.mydomain.com

    and from another:

        nameserver 10.1.1.254
        search two.mydomain.com

    then resolv.conf should look like this:

        search one.mydomain.com two.mydomain.com
        nameserver 10.0.0.1
        nameserver 10.1.1.254

    At the moment, the last dhclient to run overwrites whatever was there. I know I can preconfigure settings in dhclient.conf using supersede or append, but then I have to hard-code the values. I've scoured the man page for dhclient, but it seems like dhclient prefers to work alone (i.e. not in conjunction with any other dhclients)... or am I missing something?
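
    A sketch of the hook-script route: dhclient-script lets an enter hook redefine make_resolv_conf(), so the two leases can be merged instead of the last one winning. The snippet below is illustrative and simplified (it de-duplicates lines but does not merge the search domains onto one line); on Debian/Ubuntu it would live in /etc/dhcp/dhclient-enter-hooks.d/ (or /etc/dhcp3/... on older releases):

        # /etc/dhcp/dhclient-enter-hooks.d/merge-resolv  (sourced by dhclient-script)
        make_resolv_conf() {
            conf=/etc/resolv.conf
            {
                cat "$conf" 2>/dev/null
                for ns in $new_domain_name_servers; do echo "nameserver $ns"; done
                [ -n "$new_domain_search" ] && echo "search $new_domain_search"
            } > "$conf.new"
            # drop duplicate lines while keeping order
            awk '!seen[$0]++' "$conf.new" > "$conf"
            rm -f "$conf.new"
        }

    The packaged answer on Ubuntu is the resolvconf package, which does this merging (including combining search domains) for all interfaces that register with it.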

    Read the article

  • How would I go about sharing variables in a C++ class with Lua?

    - by Nicholas Flynt
    I'm fairly new to Lua. I've been working on implementing Lua scripting for game logic in a game engine I'm putting together. I've had no trouble so far getting Lua up and running through the engine, and I'm able to call Lua functions from C and C functions from Lua. The way the engine works now, each object class contains a set of variables that the engine can quickly iterate over to draw or process for physics. While game objects all need to access and manipulate these variables in order for the game engine itself to see any changes, they are free to create their own variables as well, and Lua is exceedingly flexible about this, so I don't foresee any issues. Anyway, currently the game engine side of things is sitting in C land, and I really want it to stay there for performance reasons. So in an ideal world, when spawning a new game object, I'd need to be able to give Lua read/write access to this standard set of variables as part of the Lua object's base class, which its game logic could then proceed to run wild with. So far, I'm keeping two separate tables of objects in place: Lua spawns a new game object, which adds itself to a numerically indexed global table of objects, and then proceeds to call a C++ function, which creates a new GameObject class and registers the Lua index (an int) with the class. So far so good; C++ functions can now see the Lua object and easily perform operations or call functions in Lua land using dostring. What I need to do now is take the C++ variables, part of the GameObject class, and expose them to Lua, and this is where google is failing me. I've encountered a very nice method here which details the process using tags, but I've read that this method is deprecated in favor of metatables. What is the ideal way to accomplish this? Is it worth the hassle of learning how to pass class definitions around using libBind or some equivalent method, or is there a simple way I can just register each variable (once, at spawn time) with the global Lua object? What's the "current" best way to do this, as of Lua 5.1.4?
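
    As of 5.1, the tag-method approach maps onto metatables roughly as follows: push each game object into Lua as a userdata holding a pointer into the engine, and route field reads/writes through __index/__newindex. A compressed sketch (the GameObject fields are made-up stand-ins for the engine's real variables; error handling is trimmed):

        // Lua 5.1 C API sketch: expose C++ struct fields to Lua via a metatable.
        #include <cstring>
        #include <lua.hpp>

        struct GameObject { float x, y; int hp; };

        static int gameobject_index(lua_State* L) {
            GameObject* obj = *(GameObject**)luaL_checkudata(L, 1, "GameObject");
            const char* key = luaL_checkstring(L, 2);
            if      (strcmp(key, "x")  == 0) lua_pushnumber(L, obj->x);
            else if (strcmp(key, "y")  == 0) lua_pushnumber(L, obj->y);
            else if (strcmp(key, "hp") == 0) lua_pushinteger(L, obj->hp);
            else lua_pushnil(L);
            return 1;
        }

        static int gameobject_newindex(lua_State* L) {
            GameObject* obj = *(GameObject**)luaL_checkudata(L, 1, "GameObject");
            const char* key = luaL_checkstring(L, 2);
            if      (strcmp(key, "x")  == 0) obj->x  = (float)luaL_checknumber(L, 3);
            else if (strcmp(key, "y")  == 0) obj->y  = (float)luaL_checknumber(L, 3);
            else if (strcmp(key, "hp") == 0) obj->hp = (int)luaL_checkinteger(L, 3);
            else return luaL_error(L, "no such field: %s", key);
            return 0;
        }

        // Push a userdata wrapping a pointer to an engine-owned GameObject.
        void push_gameobject(lua_State* L, GameObject* obj) {
            GameObject** ud = (GameObject**)lua_newuserdata(L, sizeof(GameObject*));
            *ud = obj;
            if (luaL_newmetatable(L, "GameObject")) {   // filled in only the first time
                lua_pushcfunction(L, gameobject_index);
                lua_setfield(L, -2, "__index");
                lua_pushcfunction(L, gameobject_newindex);
                lua_setfield(L, -2, "__newindex");
            }
            lua_setmetatable(L, -2);
        }

    From the Lua side, obj.x = obj.x + speed * dt then writes straight into the C++ struct the engine iterates over; no per-variable registration at spawn time is needed beyond pushing the userdata.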

    Read the article

  • Corrupted NTFS Drive showing multiple unallocated partitions

    - by volting
    My external HDD with a single NTFS partition was accidentally unplugged (kids!)... and is now corrupted. I've tried running ntfsfix with no luck (output below). When I look at the disk under Disk Management in Windows 7, it shows up as having 5 partitions, 2 of which are unallocated. None have drive letters and it is not possible to set any (that option and most others are greyed out), so I can't run chkdsk /f. I've tried using MiniTool Partition Wizard, which was mentioned as a solution to another similar question here. It showed the whole drive as one partition, but as unallocated, and the option "Check File System" was greyed out. Is there anything else I could try?

    Output of fdisk -l:

        Disk /dev/sdb: 1500.3 GB, 1500299395072 bytes
        255 heads, 63 sectors/track, 182401 cylinders, total 2930272256 sectors
        Units = sectors of 1 * 512 = 512 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x69205244

        This doesn't look like a partition table
        Probably you selected the wrong device.

           Device Boot      Start         End      Blocks   Id  System
        /dev/sdb1   ?    218129509  1920119918   850995205   72  Unknown
        /dev/sdb2   ?    729050177  1273024900   271987362   74  Unknown
        /dev/sdb3   ?    168653938   168653938           0   65  Novell Netware 386
        /dev/sdb4       2692939776  2692991410       25817+   0  Empty

        Partition table entries are not in disk order

    Output of ntfsfix:

        me@vaio:/dev$ sudo ntfsfix /dev/sdb
        Mounting volume... ntfs_mst_post_read_fixup_warn: magic: 0xffffffff size: 1024 usa_ofs: 65535 usa_count: 65534: Invalid argument
        Record 0 has no FILE magic (0xffffffff)
        Failed to load $MFT: Input/output error
        FAILED
        Attempting to correct errors... ntfs_mst_post_read_fixup_warn: magic: 0xffffffff size: 1024 usa_ofs: 65535 usa_count: 65534: Invalid argument
        Record 0 has no FILE magic (0xffffffff)
        Failed to load $MFT: Input/output error
        FAILED
        Failed to startup volume: Input/output error
        Checking for self-located MFT segment... ntfs_mst_post_read_fixup_warn: magic: 0xffffffff size: 1024 usa_ofs: 65535 usa_count: 65534: Invalid argument
        OK
        ntfs_mst_post_read_fixup_warn: magic: 0xffffffff size: 1024 usa_ofs: 65535 usa_count: 65534: Invalid argument
        Record 0 has no FILE magic (0xffffffff)
        Failed to load $MFT: Input/output error
        Volume is corrupt. You should run chkdsk.

    Options available with MiniTool: (screenshot not included)

    Related questions:

        - How to fix a damaged/corrupted NTFS filesystem/partition without losing the data on it?
        - Repair corrupted NTFS File System

    Read the article

  • Hyper-V with multiple physical networks

    - by Yaman
    I have Hyper-V on my laptop, which has a wireless card and an Ethernet card; sometimes I connect using the wireless card and sometimes using the Ethernet cable. I am trying to configure Hyper-V so that the virtual machine always has internet access. I have tried to create two virtual switches, one external on the wireless network card and the other external on the Ethernet card. The wireless switch creates a bridge object in Windows 8's Network and Sharing Center\Network Connections, while the Ethernet one does not. Unfortunately, they do not both work as external switches at the same time; I have to set whichever one is currently connected to external, and I also have to go to the properties of the bridge and the virtual Ethernet adapter to uncheck and re-check components like:

        - Client for Microsoft Networks
        - Deterministic Network Enhancer
        - VMware Bridge Protocol
        - QoS Packet Scheduler
        - File and Printer Sharing for Microsoft Networks

    in order to make things work. Sometimes I keep the wireless switch external, go to another location (another wireless network), and it disconnects, and I have to reconfigure the switches again. Is there a way to do the configuration once and have it keep working wherever I connect, whether over wireless or Ethernet, on any network with DHCP?
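
    If scripting the re-pointing is acceptable, a sketch of one approach: keep a single external switch and retarget it at whichever physical adapter is currently up, using the Hyper-V PowerShell module in Windows 8 (switch and adapter names below are assumptions and may differ on this laptop):

        # Run elevated. Point the one external switch at the active physical adapter.
        $active = Get-NetAdapter |
            Where-Object { $_.Status -eq 'Up' -and $_.Name -in 'Ethernet','Wi-Fi' } |
            Select-Object -First 1
        Set-VMSwitch -Name 'External' -NetAdapterName $active.Name

    The other commonly suggested layout is one external switch bound to the Ethernet card plus an internal switch shared via ICS from the wireless adapter, but that reintroduces manual switching as well.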

    Read the article

  • How to synchronise address books across multiple PCs

    - by Rob
    I am using Thunderbird 15.0.1 on two different Windows computers (Vista and Win 7). I maintain the email list for a local club, and often need to send out emails to all members from my home and office PCs. I use a Thunderbird address book to maintain the list of club members and would like any changes made to that list to synchronise to both machines. Can anyone recommend a good way of doing this? I've found a couple of extensions that claim to do this, but both have reviews suggesting they no longer function correctly, since they are quite old extensions. Is there another tool that I would be better off using? Rob

    Read the article

  • Routerless, house-wired network using multiple powerline adapters

    - by Cliff Arnell
    Related to the 'old days' of one Ethernet cable tapped with Ts for each monitor... my question might be very simple, or not. I have an over-the-air internet provider: a dish with a powered transceiver, and Cat5 cable out of the provider-supplied modem. I presently connect the output of the modem to my wireless router, which sends the internet signal all over the house. Standard stuff, I believe. My question: can I just connect the output of the modem to one powerline adapter and tie all my equipment (computer, printer, laptop, TiVo recorder, etc.) into one local powerline adapter each, resulting in a 'house-wired' network and no router? I'm bothered by the idea that my over-the-air provider might be using something in my router to establish and keep my IP connection alive. I did have to configure the router for my IP, a router which, in my proposed scenario, would no longer exist. Thank you for your help.

    Read the article

  • 389 DS architecture for multiple sites

    - by Kyle Flavin
    I'm looking to deploy 389 Directory Server in my environment to replace an existing iPlanet installation. I would be using it primarily to store user account data for authentication purposes. I have two physically separate data centers that I would like to share the same directory tree. My initial thinking is to set up 389 DS as follows:

        - a master/consumer pair in data center A
        - a master/consumer pair in data center B
        - a replication agreement between both masters, to mirror the directory tree in both environments

    Does this sound like a reasonable approach? Is there a better way to do it (e.g. four masters)? Is there documentation on best practices for setting up 389 DS in situations such as this? Thanks.

    Read the article
