Search Results

Search found 11930 results on 478 pages for 'shared machines'.


  • node.js on multi-core machines

    - by zaharpopov
    node.js looks interesting, BUT... I must be missing something - isn't node.js tuned to run only in a single process and thread? Then how does it scale to multi-core CPUs and multi-CPU servers? After all, it's great to make a single-threaded server as fast as possible, but for high loads I would want to use several CPUs. The same goes for making applications faster - the way today seems to be to use multiple CPUs and parallelize the tasks. How does node.js fit into this picture? Is the idea to somehow distribute multiple instances, or what?

    Read the article

  • Storing VMware virtual machines in external harddisk

    - by Jeffery Ang
    Any issue with that? I recently set up an Ubuntu virtual machine on my external hard disk. Once, I accidentally removed the external hard disk's USB connection, and when I tried to start up the Ubuntu virtual machine again I got a kernel panic. I'm thinking of writing a cron script to back it up every day to another location to prevent this issue.

    Read the article

  • How are interrupts handled by dual processor machines?

    - by jeffD
    I have an idea of how interrupts are handled by a dual-core CPU, but I was wondering how interrupt handling is implemented on a board with more than one physical processor. Is any of the interrupt responsibility determined by the physical board's configuration? Each processor must be able to handle some types of interrupts, like disk I/O - unless there is some circuitry to manage and dispatch interrupts to the appropriate processor? My guess is that the scheme must be processor-neutral, so that any processor and core can run the interrupt handler. If a core is waiting on a disk read, will that core be the one to run the interrupt handler when the disk is ready?

    Read the article

  • Virtual machines on USB external disk

    - by Cris
    Hello, I'm thinking about installing Ubuntu 10.04 and Windows 7 (using Sun VirtualBox) on an external disk connected to the USB port of my new MacBook; do you think performance will be terrible? Have you tried something similar? Thanks in advance! c.

    Read the article

  • Running virtual machines: Linux vs Windows 7

    - by vikp
    Hi, I have tried running a Windows XP development virtual machine under Windows 7 and the performance was dreadful. I'm considering installing Linux and running the virtual machine from Linux, but I'm not sure whether I can expect any performance gains. It's a 2.4 GHz Core 2 Duo machine with 4 GB of RAM and a 5,400 RPM HDD. Can somebody please recommend a very cut-down version of Linux that can run VMware Player and isn't resource hungry? Thank you

    Read the article

  • ruby on rails state machines

    - by srboisvert
    I'm looking to implement a state machine to manage a user moving through a series of steps over an extended period of time (weeks), with emails, and then they interact with the app. I've looked at a couple of AASM plugins and forks (it seems this plugin space has become a bit chaotic) and am curious what people would recommend. I saw the "automatic AASM" by Hashrocket, which transitions states using cron, and from the title it looks like it might fit the bill, but there doesn't appear to be any documentation anywhere and it looks more like a skeleton app than a plugin.

    Read the article

  • `:Zone.Identifier` files keep on appearing in Windows XP virtual machine

    - by Jonathan Reno
    I have a Windows XP Home Edition guest and a Linux Mint 13 host. I use VirtualBox, and the ~/Public directory is shared with the guest. It sometimes happens that I use IE on the guest system to download files (until I get a better Windows browser). All of the downloaded files go to the L:\ drive (the ~/Public directory). When they are finished downloading, Windows Explorer adds a :Zone.Identifier file for each file I download. When I extract a downloaded ZIP archive on the guest (on drive L:\), Windows creates a :Zone.Identifier file for every file in the extracted directory. This even occurs if I use the host to move a file to the ~/Public directory. The shared ~/Public directory is on an ext4 partition, and the colon character is supposed to be illegal in file names on Windows, but not on the ext4 partition. Is there any way to stop Windows from putting all this rubbish on my filesystem? (I might have to create a shell script to clean up after Windows.) Here is what I see in Windows Explorer: By the way, if I were running a Mac OS X host (where colons are illegal file name characters) this would be even more horrendous.
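
    The asker mentions possibly writing a cleanup script; purely as an illustration of that idea, here is a minimal C# sketch that could run on the Linux host under Mono or .NET. The share path is an assumption, and it assumes the guest's ":Zone.Identifier" streams show up on ext4 as ordinary files whose names end with that suffix:

        using System;
        using System.IO;

        class ZoneIdentifierCleanup
        {
            static void Main()
            {
                // Hypothetical path to the shared ~/Public folder on the Linux host.
                string shareRoot = "/home/jonathan/Public";

                // Delete every file whose name ends with ":Zone.Identifier".
                foreach (string path in Directory.EnumerateFiles(shareRoot, "*", SearchOption.AllDirectories))
                {
                    if (path.EndsWith(":Zone.Identifier", StringComparison.OrdinalIgnoreCase))
                    {
                        Console.WriteLine("Deleting " + path);
                        File.Delete(path);
                    }
                }
            }
        }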

    Read the article

  • Memory Bandwidth Performance for Modern Machines

    - by porgarmingduod
    I'm designing a real-time system that occasionally has to duplicate a large amount of memory. The memory consists of non-tiny regions, so I expect the copying performance will be fairly close to the maximum bandwidth the relevant components (CPU, RAM, motherboard) can manage. This led me to wonder what kind of raw memory bandwidth a modern commodity machine can muster. My aging Core2Duo gives me 1.5 GB/s if I use 1 thread to memcpy() (and understandably less if I memcpy() with both cores simultaneously). While 1.5 GB is a fair amount of data, the real-time application I'm working on will have something like 1/50th of a second, which means 30 MB. Basically, almost nothing. And perhaps worst of all, as I add more cores, I can process a lot more data without any increase in performance for the needed duplication step. But a low-end Core2Duo isn't exactly hot stuff these days. Are there any sites with information, such as actual benchmarks, on raw memory bandwidth of current and near-future hardware? Furthermore, for duplicating large amounts of data in memory, are there any shortcuts, or is memcpy() as good as it will get? Given a bunch of cores with nothing to do but duplicate as much memory as possible in a short amount of time, what's the best I can do?
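
    As a rough way to get a feel for a particular machine's copy bandwidth from managed code (not a substitute for a native memcpy() benchmark, and Buffer.BlockCopy will typically land somewhat below a tuned memcpy), a minimal C# sketch:

        using System;
        using System.Diagnostics;

        class CopyBandwidth
        {
            static void Main()
            {
                const int size = 256 * 1024 * 1024;   // 256 MB per buffer
                const int iterations = 10;
                byte[] src = new byte[size];
                byte[] dst = new byte[size];
                new Random(1).NextBytes(src);          // touch the source pages so they are committed

                Stopwatch sw = Stopwatch.StartNew();
                for (int i = 0; i < iterations; i++)
                {
                    Buffer.BlockCopy(src, 0, dst, 0, size);
                }
                sw.Stop();

                double gigabytesCopied = (double)size * iterations / (1024.0 * 1024.0 * 1024.0);
                Console.WriteLine("Copied {0:F1} GB in {1:F2} s = {2:F2} GB/s",
                    gigabytesCopied, sw.Elapsed.TotalSeconds, gigabytesCopied / sw.Elapsed.TotalSeconds);
            }
        }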

    Read the article

  • Account sharing among Ubuntu machines

    - by muckabout
    I'd like a simple and secure way to allow users in our network to have their account (e.g. 'myname') work on every machine in the network - such that they could ssh to any machine and have the same user ID and the same mounted SMB share. Any suggestions?

    Read the article

  • State machines in C#

    - by Sir Psycho
    Hi, I'm trying to work out what's going on with this code. I have two threads iterating over the range and I'm trying to understand what is happening when the second thread calls GetEnumerator(). This line in particular (T current = start;) seems to spawn a new 'instance' in this method for the second thread. Seeing that there is only one instance of the DateRange class, I'm trying to understand why this works. Thanks in advance.

        class Program
        {
            static void Main(string[] args)
            {
                var daterange = new DateRange(DateTime.Now, DateTime.Now.AddDays(10), new TimeSpan(24, 0, 0));

                var ts1 = new ThreadStart(delegate
                {
                    foreach (var date in daterange)
                    {
                        Console.WriteLine("Thread " + Thread.CurrentThread.ManagedThreadId + " " + date);
                    }
                });

                var ts2 = new ThreadStart(delegate
                {
                    foreach (var date in daterange)
                    {
                        Console.WriteLine("Thread " + Thread.CurrentThread.ManagedThreadId + " " + date);
                    }
                });

                Thread t1 = new Thread(ts1);
                Thread t2 = new Thread(ts2);

                t1.Start();
                Thread.Sleep(4000);
                t2.Start();

                Console.Read();
            }
        }

        public class DateRange : Range<DateTime>
        {
            public DateTime Start { get; private set; }
            public DateTime End { get; private set; }
            public TimeSpan SkipValue { get; private set; }

            public DateRange(DateTime start, DateTime end, TimeSpan skip)
                : base(start, end)
            {
                SkipValue = skip;
            }

            public override DateTime GetNextElement(DateTime current)
            {
                return current.Add(SkipValue);
            }
        }

        public abstract class Range<T> : IEnumerable<T> where T : IComparable<T>
        {
            readonly T start;
            readonly T end;

            public Range(T start, T end)
            {
                if (start.CompareTo(end) > 0)
                    throw new ArgumentException("Start value greater than end value");

                this.start = start;
                this.end = end;
            }

            public abstract T GetNextElement(T currentElement);

            public IEnumerator<T> GetEnumerator()
            {
                T current = start;
                do
                {
                    Thread.Sleep(1000);
                    yield return current;
                    current = GetNextElement(current);
                } while (current.CompareTo(end) < 1);
            }

            System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
            {
                return GetEnumerator();
            }
        }
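
    As background to the question above (not the author's code): a C# iterator method is compiled into a hidden state-machine class, and each call to an IEnumerator-returning iterator such as GetEnumerator() constructs a fresh instance of that class, so every foreach (and therefore each thread) gets its own private copy of current. A minimal, self-contained sketch of that mechanism:

        using System;
        using System.Collections.Generic;

        class IteratorDemo
        {
            // An iterator method: the compiler rewrites this into a hidden state-machine class.
            static IEnumerator<int> Count(int start)
            {
                int current = start;          // each enumerator instance gets its own 'current'
                while (true)
                {
                    yield return current;
                    current++;
                }
            }

            static void Main()
            {
                IEnumerator<int> a = Count(0);
                IEnumerator<int> b = Count(0);

                // Two separate state-machine objects, so advancing one does not affect the other.
                Console.WriteLine(ReferenceEquals(a, b));   // False
                a.MoveNext(); a.MoveNext(); a.MoveNext();
                b.MoveNext();
                Console.WriteLine(a.Current);               // 2
                Console.WriteLine(b.Current);               // 0
            }
        }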

    Read the article

  • Issue with RegConnectRegistry connecting to 64 bit machines

    - by RA
    I'm seeing a weird thing when connecting to the performance registry on 64-bit editions of Windows. The whole program stalls and call stacks become unreadable. After a long timeout, the connection attempt aborts and everything goes back to normal. The only solution is to make sure that only one thread at a time queries the remote registry, unless the remote machine is a 32-bit Windows XP, 2003 or 2000, in which case you can use as many threads as you like. Does anyone have a technical explanation for why this might be happening? I've spent 2-3 days searching the web without coming up with anything. Here is a test program: run it first with one thread (connecting to a 64-bit Windows machine), then remove the comment in _tmain and run it with 4 threads. Running it with one thread works as expected; running with 4 returns ERROR_BUSY (dwRet == 170) after stalling for a while. Remember to set the remote machine correctly in RegConnectRegistry before running the program.

        #define TOTALBYTES    8192
        #define BYTEINCREMENT 4096

        void PerfmonThread(void *pData)
        {
            DWORD BufferSize = TOTALBYTES;
            DWORD cbData;
            DWORD dwRet;

            PPERF_DATA_BLOCK PerfData = (PPERF_DATA_BLOCK) malloc( BufferSize );
            cbData = BufferSize;

            printf("\nRetrieving the data...");

            HKEY hKey;
            DWORD dwAccessRet = RegConnectRegistry(L"REMOTE_MACHINE", HKEY_PERFORMANCE_DATA, &hKey);

            dwRet = RegQueryValueEx( hKey, L"global", NULL, NULL, (LPBYTE) PerfData, &cbData );
            while( dwRet == ERROR_MORE_DATA )
            {
                // Get a buffer that is big enough.
                BufferSize += BYTEINCREMENT;
                PerfData = (PPERF_DATA_BLOCK) realloc( PerfData, BufferSize );
                cbData = BufferSize;
                printf(".");
                dwRet = RegQueryValueEx( hKey, L"global", NULL, NULL, (LPBYTE) PerfData, &cbData );
            }

            if( dwRet == ERROR_SUCCESS )
                printf("\n\nFinal buffer size is %d\n", BufferSize);
            else
                printf("\nRegQueryValueEx failed (%d)\n", dwRet);

            RegCloseKey(hKey);
        }

        int _tmain(int argc, _TCHAR* argv[])
        {
            _beginthread(PerfmonThread, 0, NULL);
            /*
            _beginthread(PerfmonThread, 0, NULL);
            _beginthread(PerfmonThread, 0, NULL);
            _beginthread(PerfmonThread, 0, NULL);
            */
            while(1)
            {
                Sleep(2000);
            }
        }

    Read the article

  • how random is Math.random() in java across different jvms or different machines

    - by user881480
    I have a large program distributed across many different physical servers; each program spawns many threads, and each thread uses Math.random() in its operations to draw a piece from one of many common resource pools. The goal is to utilize the pools evenly across all operations. Sometimes it doesn't appear very random when looking at a snapshot of a resource pool to see which pieces are being drawn at that instant (it might actually be random, but it's hard to measure and find out for sure). Is there something better than Math.random() that performs just as well (or at least not much worse)?

    Read the article

  • WCF Fails when using impersonation over 2 machine boundaries (3 machines)

    - by MrTortoise
    These scenarios work in their individual pieces; it's when I put it all together that it breaks. I have a WCF service using netTCP that uses impersonation to get the caller's ID (role-based security will be used at this level). On top of this is a WCF service using basicHttp with TransportCredentialOnly, which also uses impersonation. I then have a client front end that connects to the basicHttp service. The aim of the game is to return the client's username from the netTCP service at the bottom, so that ultimately I can use role-based security there. Each service is on a different machine, and each service works when you remove the calls it makes to other services and run a client for it both locally and remotely; i.e. the problem only manifests when you jump across more than one machine boundary. In other words, the setup breaks when I connect each part together, but the parts work fine on their own. I also specify [OperationBehavior(Impersonation = ImpersonationOption.Required)] in the method and have IIS set up to allow only Windows authentication (actually I still have anonymous enabled, but disabling it makes no difference). This impersonation works fine in the scenario where I have a netTCP service on machine A, with a client plus basicHttp service on machine B, and a client for the basicHttp service also on machine B. However, if I move that client to any machine C I get the following error. The exception is 'The socket connection was aborted. This could be caused by an error processing your message or a receive timeout being exceeded by the remote host, or an underlying network resource issue. Local socket timeout was '00:10:00''; the inner message is 'An existing connection was forcibly closed by the remote host'. I'm beginning to think this is more a network issue than config... but then I'm grasping at straws...
    The config files are as follows (heading from the client down to the netTCP layer):

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
          <system.serviceModel>
            <bindings>
              <basicHttpBinding>
                <binding name="basicHttpBindingEndpoint" closeTimeout="00:02:00" openTimeout="00:02:00"
                         receiveTimeout="00:10:00" sendTimeout="00:02:00" allowCookies="false"
                         bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
                         maxBufferSize="65536" maxBufferPoolSize="524288" maxReceivedMessageSize="65536"
                         messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
                         useDefaultWebProxy="true">
                  <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                                maxBytesPerRead="4096" maxNameTableCharCount="16384" />
                  <security mode="TransportCredentialOnly">
                    <transport clientCredentialType="Windows" proxyCredentialType="None" realm="" />
                    <message clientCredentialType="UserName" algorithmSuite="Default" />
                  </security>
                </binding>
              </basicHttpBinding>
            </bindings>
            <client>
              <endpoint address="http://panrelease01/WCFTopWindowsTest/Service1.svc" binding="basicHttpBinding"
                        bindingConfiguration="basicHttpBindingEndpoint" contract="ServiceReference1.IService1"
                        name="basicHttpBindingEndpoint" behaviorConfiguration="ImpersonationBehaviour" />
            </client>
            <behaviors>
              <endpointBehaviors>
                <behavior name="ImpersonationBehaviour">
                  <clientCredentials>
                    <windows allowedImpersonationLevel="Impersonation" />
                  </clientCredentials>
                </behavior>
              </endpointBehaviors>
            </behaviors>
          </system.serviceModel>
        </configuration>

    The service for the client (the basicHttp service, which is also the client for the netTCP service):

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.web>
            <compilation debug="true" targetFramework="4.0" />
          </system.web>
          <system.serviceModel>
            <bindings>
              <netTcpBinding>
                <binding name="netTcpBindingEndpoint" closeTimeout="00:01:00" openTimeout="00:01:00"
                         receiveTimeout="00:10:00" sendTimeout="00:01:00" transactionFlow="false"
                         transferMode="Buffered" transactionProtocol="OleTransactions"
                         hostNameComparisonMode="StrongWildcard" listenBacklog="10"
                         maxBufferPoolSize="524288" maxBufferSize="65536" maxConnections="10"
                         maxReceivedMessageSize="65536">
                  <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                                maxBytesPerRead="4096" maxNameTableCharCount="16384" />
                  <reliableSession ordered="true" inactivityTimeout="00:10:00" enabled="false" />
                  <security mode="Transport">
                    <transport clientCredentialType="Windows" protectionLevel="EncryptAndSign" />
                    <message clientCredentialType="Windows" />
                  </security>
                </binding>
              </netTcpBinding>
              <basicHttpBinding>
                <binding name="basicHttpWindows">
                  <security mode="TransportCredentialOnly">
                    <transport clientCredentialType="Windows"></transport>
                  </security>
                </binding>
              </basicHttpBinding>
            </bindings>
            <client>
              <endpoint address="net.tcp://5d2x23j.panint.com/netTCPwindows/Service1.svc" binding="netTcpBinding"
                        bindingConfiguration="netTcpBindingEndpoint" contract="ServiceReference1.IService1"
                        name="netTcpBindingEndpoint" behaviorConfiguration="ImpersonationBehaviour">
                <identity>
                  <dns value="localhost" />
                </identity>
              </endpoint>
            </client>
            <behaviors>
              <endpointBehaviors>
                <behavior name="ImpersonationBehaviour">
                  <clientCredentials>
                    <windows allowedImpersonationLevel="Impersonation" allowNtlm="true" />
                  </clientCredentials>
                </behavior>
              </endpointBehaviors>
              <serviceBehaviors>
                <behavior name="WCFTopWindowsTest.basicHttpWindowsBehaviour">
                  <!-- To avoid disclosing metadata information, set the value below to false and remove the metadata endpoint above before deployment -->
                  <serviceMetadata httpGetEnabled="true" />
                  <!-- To receive exception details in faults for debugging purposes, set the value below to true. Set to false before deployment to avoid disclosing exception information -->
                  <serviceDebug includeExceptionDetailInFaults="true" />
                </behavior>
              </serviceBehaviors>
            </behaviors>
            <services>
              <service name="WCFTopWindowsTest.Service1"
                       behaviorConfiguration="WCFTopWindowsTest.basicHttpWindowsBehaviour">
                <endpoint address="" binding="basicHttpBinding" bindingConfiguration="basicHttpWindows"
                          name="basicHttpBindingEndpoint" contract="WCFTopWindowsTest.IService1">
                </endpoint>
              </service>
            </services>
            <serviceHostingEnvironment multipleSiteBindingsEnabled="true" />
          </system.serviceModel>
          <system.webServer>
            <modules runAllManagedModulesForAllRequests="true" />
            <directoryBrowse enabled="true" />
          </system.webServer>
        </configuration>

    Then finally the service for the netTCP layer:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.web>
            <authentication mode="Windows"></authentication>
            <authorization>
              <allow roles="*"/>
            </authorization>
            <compilation debug="true" targetFramework="4.0" />
            <identity impersonate="true" />
          </system.web>
          <system.serviceModel>
            <bindings>
              <netTcpBinding>
                <binding name="netTCPwindows">
                  <security mode="Transport">
                    <transport clientCredentialType="Windows"></transport>
                  </security>
                </binding>
              </netTcpBinding>
            </bindings>
            <services>
              <service behaviorConfiguration="netTCPwindows.netTCPwindowsBehaviour" name="netTCPwindows.Service1">
                <endpoint address="" bindingConfiguration="netTCPwindows" binding="netTcpBinding"
                          name="netTcpBindingEndpoint" contract="netTCPwindows.IService1">
                  <identity>
                    <dns value="localhost" />
                  </identity>
                </endpoint>
                <endpoint address="mextcp" binding="mexTcpBinding" contract="IMetadataExchange" />
                <host>
                  <baseAddresses>
                    <add baseAddress="net.tcp://localhost:8721/test2" />
                  </baseAddresses>
                </host>
              </service>
            </services>
            <behaviors>
              <serviceBehaviors>
                <behavior name="netTCPwindows.netTCPwindowsBehaviour">
                  <!-- To avoid disclosing metadata information, set the value below to false and remove the metadata endpoint above before deployment -->
                  <serviceMetadata httpGetEnabled="false" />
                  <!-- To receive exception details in faults for debugging purposes, set the value below to true. Set to false before deployment to avoid disclosing exception information -->
                  <serviceDebug includeExceptionDetailInFaults="true" />
                </behavior>
              </serviceBehaviors>
            </behaviors>
            <serviceHostingEnvironment multipleSiteBindingsEnabled="true" />
          </system.serviceModel>
          <system.webServer>
            <modules runAllManagedModulesForAllRequests="true" />
            <directoryBrowse enabled="true" />
          </system.webServer>
        </configuration>
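
    For reference, a minimal sketch (with illustrative service and contract names, not taken from the configs above) of how the impersonated caller's identity is typically read inside an operation marked with ImpersonationOption.Required:

        using System;
        using System.Security.Principal;
        using System.ServiceModel;

        [ServiceContract]
        public interface IWhoAmIService
        {
            [OperationContract]
            string WhoAmI();
        }

        public class WhoAmIService : IWhoAmIService
        {
            [OperationBehavior(Impersonation = ImpersonationOption.Required)]
            public string WhoAmI()
            {
                // The Windows identity of the caller, as seen after impersonation.
                WindowsIdentity caller = ServiceSecurityContext.Current.WindowsIdentity;
                return caller != null ? caller.Name : "(no Windows identity)";
            }
        }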

    Read the article

  • Why do hosts prefer Linux to Windows Server?

    - by iconiK
    So far I see a HUGE majority of hosts providing only Linux shared hosting, offering Windows only on VPS (or even only on dedicated servers). Why is that? While Windows is a lot more expensive than Linux (though it depends on a lot of factors, not just initial and support license cost), it also provides ASP.NET, IIS and, of course, Microsoft SQL Server. I know in the past it might have been because of cPanel being Linux-only, but now they have a Windows version. But still, why is Linux predominantly used for shared hosting? PHP works on both systems. IIS can be (and probably is) faster. MySQL runs on both systems as well. cPanel has a Windows version. Python, Perl and Ruby all run on Windows as well. You even have MS SQL Server Express, which I find superior to MySQL in both speed and features. Access is there for low-usage requirements, as is SQLite (which is great for quick, small stuff). And with PowerShell you have a good alternative to the Unix shell. EDIT: I am looking for common reasons; I realize each hosting company (and/or its clients) may have different needs. This becomes very important when you get to VPS or cloud offerings, which give you a full operating system to use.

    Read the article

  • How to connect to Oracle 10g server from client machines

    - by Tareq
    I have installed Oracle 10g on one of my office's computers, which I want to keep as the database server. I am developing a .NET project which will communicate with the database server from a client machine and from the server machine. I succeeded in communicating with Oracle from the server machine, but not from the client machine, using the .NET project. The connection code is as follows:

        Public OraConn As ADODB.Connection
        OraConn = New ADODB.Connection
        OraConn.Provider = "OraOLEDB.Oracle"
        OraConn.ConnectionString = "Data Source=<my_database_name>;User ID=<my_user>;Password=<my_pass>;"
        OraConn.Open()

    Please tell me, step by step, how I can connect to my server database from my .NET client program residing on the client machine. Thanks in advance.
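
    A hedged C# sketch of the same connection using the provider the question already names (OraOLEDB.Oracle), assuming the Oracle client software is installed on the client machine and that a TNS alias (here the hypothetical MYDB) pointing at the server is defined in the client's tnsnames.ora:

        using System;
        using System.Data.OleDb;

        class OracleClientTest
        {
            static void Main()
            {
                // "MYDB" is a hypothetical TNS alias defined in tnsnames.ora on the client machine;
                // the user and password are placeholders.
                string connectionString =
                    "Provider=OraOLEDB.Oracle;Data Source=MYDB;User Id=my_user;Password=my_pass;";

                using (OleDbConnection conn = new OleDbConnection(connectionString))
                {
                    conn.Open();
                    using (OleDbCommand cmd = new OleDbCommand("SELECT sysdate FROM dual", conn))
                    {
                        Console.WriteLine("Server time: " + cmd.ExecuteScalar());
                    }
                }
            }
        }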

    Read the article

  • Can a website company that builds 4-5 websites a year afford dedicated hosting?

    - by Petras
    We manage about 30 websites that use shared ASP.NET SQL Server web hosting. These are typical small/medium business websites and they perform fine in this environment. Recently I was looking at VPS hosting in this thread: http://serverfault.com/questions/128329/how-do-you-host-multiple-public-facing-websites-on-a-vps After contacting a provider from one of the replies, I was told that VPS hosting is not recommended for 30 sites, even if they are small - the resource requirements might be too great even for a VPS - so I should turn to dedicated hosting. The lowest-cost dedicated hosting is $219 per month (see http://www.serverintellect.com/dedicated/pentiumdservers.aspx), but this is only for a single processor, which seems too light for a machine running both IIS and SQL. In our office all the developers work on quad cores, so I assume I'd really need the quad-processor option. However, this starts at $599 monthly. Now, I won't be able to transfer all of our 30 sites to this machine; I'd only be able to transfer say 5 or 6. However, moving forward, I'd be able to host all future sites on this machine, which amounts to 4-5 per year. Let's look at the economics. Shared hosting costs are typically $16.95 monthly (see http://www.crystaltech.com/dotnet.aspx). So here's the dilemma. First month's costs: $599. First month's revenue: 6 x $16.95 = $101.70. Loss in first month: $497.30. First year's costs: $599 x 12 = $7,188. First year's revenue: 6 x $16.95 x 12 + 5 x $16.95 x 6 (averaged) = $1,728.90. Loss in first year: $5,459.10. Clearly it is going to take years for this server to pay for itself. It just doesn't seem economical! Am I missing something here, or is dedicated not the way to go for the number of sites we build?
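
    A small calculation sketch that just mechanizes the arithmetic above, using only the numbers quoted in the question (the 4-5 new sites per year are averaged as 4.5; everything else is illustrative):

        using System;

        class HostingBreakEven
        {
            static void Main()
            {
                const double serverCostPerMonth = 599.0;     // quad-processor dedicated box
                const double revenuePerSitePerMonth = 16.95; // shared-hosting fee per site
                const double newSitesPerYear = 4.5;          // 4-5 new sites per year, averaged
                double sites = 6;                            // sites moved over initially

                double cumulative = 0;
                for (int month = 1; month <= 120; month++)
                {
                    cumulative += sites * revenuePerSitePerMonth - serverCostPerMonth;
                    sites += newSitesPerYear / 12.0;
                    if (cumulative >= 0)
                    {
                        Console.WriteLine("Breaks even after {0} months", month);
                        return;
                    }
                }
                Console.WriteLine("Still {0:C} in the red after 10 years", -cumulative);
            }
        }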

    Read the article

  • Developing on both Windows & Linux machines simultaneously

    - by Jamie
    Sorry for the bad title (couldn't think of a better way to describe it). I have a Windows machine which I do development on. However, I have a new project which needs to interact with a Linux system (executing Linux commands etc.), so obviously I can't do the development entirely on my Windows machine, and I don't want to code on the dev machine, svn commit, and then svn update on the Linux machine. Is there a way for any changes I make on my dev machine to be quickly mirrored to the Linux machine? SVN is not a very quick alternative, and of course some changes will be very minor. Any ideas? A network share, I guess... but that's not very pretty (and a bit slow too). As fellow developers, I would like to know if you've been in a similar situation and how you've resolved it. On a further note, I can't just install Ubuntu as my development machine and mirror the commands, applications etc. from the Linux machine, because it's a cluster 'master' machine and therefore has quite a special configuration. Thanks guys! EDIT: I've also thought about having web services on the Linux machine and then just calling them from code, thus removing the platform dependency from development. What do you think about that too? Thanks
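
    One variation on the network-share idea mentioned above, purely as a sketch: a small watcher on the Windows side that copies each saved file to a share exported by the Linux box. Both paths below are assumptions, and a real tool would debounce and retry:

        using System;
        using System.IO;

        class MirrorToLinux
        {
            // Hypothetical paths: local working copy and a Samba/NFS share exported by the Linux machine.
            const string LocalRoot = @"C:\work\project";
            const string RemoteRoot = @"\\linux-cluster\project";

            static void Main()
            {
                var watcher = new FileSystemWatcher(LocalRoot)
                {
                    IncludeSubdirectories = true,
                    EnableRaisingEvents = true
                };
                watcher.Changed += (s, e) => CopyFile(e.FullPath);
                watcher.Created += (s, e) => CopyFile(e.FullPath);

                Console.WriteLine("Mirroring changes; press Enter to stop.");
                Console.ReadLine();
            }

            static void CopyFile(string fullPath)
            {
                if (Directory.Exists(fullPath)) return;   // ignore directory events
                string relative = fullPath.Substring(LocalRoot.Length).TrimStart('\\');
                string target = Path.Combine(RemoteRoot, relative);
                Directory.CreateDirectory(Path.GetDirectoryName(target));
                try
                {
                    File.Copy(fullPath, target, true);
                    Console.WriteLine("Copied " + relative);
                }
                catch (IOException)
                {
                    // The file may still be locked by the editor; a real tool would retry later.
                }
            }
        }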

    Read the article

  • Session variables not getting set but only in Internet Explorer and not on all machines

    - by gaoshan88
    Logging into a site I'm working on functions as expected on my local machine but fails on the remote server, and ONLY in Internet Explorer. The kicker is that it works in IE locally, just not on the remote machine. What in the world could cause this? I have stepped through the code on the remote machine and can see the entered login values being checked against the database; they are found, and then a login function is called. This sets two $_SESSION variables and redirects to the main admin page. However, in IE only (and not when run on my local machine... this is key) the $_SESSION variables are not present by the time you get to the main admin page. var_dump($_SESSION) gives me what I expect in every browser when I am running this in my local environment, and in every browser except IE 6, 7 and 8 when run on the remote server (where I get array(0) { } as if nothing has been set in $_SESSION). This really has me stumped, so any advice is appreciated. For example, in IE run locally, var_dump gives me:

        array
          'Username' => string 'theusername' length=11
          'UserID' => string 'somevalue' length=9

    Run on the remote server (IE only... it works fine in other browsers), var_dump gives me:

        array(0) { }

    Code:

        $User = GetUser($Username, $Password);

        if ($User->UserID <> "") {
            // this works so we call Login()...
            Login($User);
            // this also works and gives expected results. on to redirect...
            header("Location: index.php");
            // a var_dump at index.php shows that there is no session data at all in IE, remotely.
        } else {
            header("Location: login.php");
        }

        function Login($data) {
            $_SESSION['Username'] = $data->Username;
            $_SESSION['UserID'] = $data->UserID;
            // a var_dump here gives the expected data in every browser
        }

    Read the article

  • Easy way to replicate web page across machines?

    - by Mike_G
    I am trying to replicate a browser page to another browser on another machine. I basically want to reproduce a page exactly as it appears to a customer, for viewing by the website owner. I have done this before using some impersonation trickery, but found that it would throw the session state out of whack when the site owner switched customers, so I would like to stay away from cookie and authentication manipulation. Has anybody done anything like that? Is there a way to easily transfer the DOM to a web service? The tech/programming at my disposal is C#, JavaScript and WCF.
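
    On the "transfer the DOM to a web service" idea, a rough sketch only: client-side JavaScript could POST document.documentElement.outerHTML to a small endpoint that stores each snapshot for the owner to open later. The receiver below uses HttpListener; the URL, port and file naming are illustrative assumptions, not a recommended production design:

        using System;
        using System.IO;
        using System.Net;

        class DomSnapshotReceiver
        {
            static void Main()
            {
                // The customer's page would POST its outerHTML to this URL via XMLHttpRequest.
                var listener = new HttpListener();
                listener.Prefixes.Add("http://localhost:8080/snapshot/");
                listener.Start();
                Console.WriteLine("Waiting for DOM snapshots...");

                while (true)
                {
                    HttpListenerContext context = listener.GetContext();
                    string html;
                    using (var reader = new StreamReader(context.Request.InputStream, context.Request.ContentEncoding))
                    {
                        html = reader.ReadToEnd();
                    }

                    // Store each snapshot under a timestamped file name for the site owner to open.
                    string fileName = "snapshot-" + DateTime.UtcNow.Ticks + ".html";
                    File.WriteAllText(fileName, html);

                    context.Response.StatusCode = 200;
                    context.Response.Close();
                    Console.WriteLine("Saved " + fileName);
                }
            }
        }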

    Read the article

  • Git repos over multiple machines - backups and keeping in sync

    - by a-or-b
    I'm new to git so please feel free to RTFM me... I have multiple development sites (none of which can communicate with each other over a network) and am working on a few projects (with a few people) at any one time. What I would ideally have is, at each site, a centralized repository that can be pulled from, while development occurs in our own (personal) repos. Then I would like to be able to sync the centralized repos across sites (via USB key, for example). I want a centralized repo at each location because (1) I'm new to git and do break my (personal) local repo by playing around, and (2) some projects get put on hold, so I want to be able to free up disk space by deleting them. This is the "backup" part of my question. I was also hoping to be able to use 'git clone --bare' for my centralized repos (and the USB key repos too?), since we don't need the full checkout, just the git benefits. However, I can't seem to get a bare repo to work as a repo I can push from. I've used 'git remote' to set up a remote origin (similar to http://toolmantim.com/thoughts/setting_up_a_new_remote_git_repository) but I can't get 'git push' to work - it seems I need a checked-out repo. Does anyone else use this sort of repo/development structure, or is there something fundamental about git usage that I'm missing? A solution I thought about that might not work: if I had a 'git clone --bare' at each site and then a git repo on my removable media with remotes set up for each site, then I could sync my USB key with each repo by pulling. But then can I update the site repo from my USB key? Could I push from USB?

    Read the article

  • sharing a USB printer in SOHO environment [migrated]

    - by Registered User
    Here is the situation I am facing: there is a USB printer which works only on a Windows XP machine, and there are other devices on the LAN; it is a small office / home office environment. How can this USB printer, attached to the Windows XP machine, be shared so that other laptops or users on the network who have Windows 7 or Linux on their laptops can use it? The printer model is a Canon Laser Shot LBP-1210: http://www.canon-europe.com/For_Home/Product_Finder/Printers/Laser/LaserShot_LBP1210/index.asp A print server is not available to me; I need to make it work in this situation only. What can I do? The clients are unable to connect to it, and it is not a network or TCP/IP printer. If someone on a Windows 7 machine wants to use this printer to take a print, they get an error while adding the printer to their machine (whereas the printer is a USB printer on the Windows XP machine): Start ---> Devices and Printers ---> Add Printer ---> Find a printer by name or IP address ---> selected a shared printer by name ---> \\PC-Name-printer3 and select Browse. It gives the message "Windows cannot find a driver for Canon LASER SHOT LBP-1210 on the network". What does this mean? Do I need to install some kind of software on the client machine, or on the machine where the printer is attached?

    Read the article

  • Compare DateTime ticks on two machines

    - by vani
    Is it a viable option to compare the FileInfo.CreationTimeUtc.Ticks values of two files on two different computers to see which version is newer, or is there a better way? Do Ticks depend on OS time, or are they really physical ticks from some fixed date in the past?
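
    For reference, DateTime.Ticks counts 100-nanosecond intervals since a fixed epoch (midnight, 1 January 0001), so UTC tick values from two machines are directly comparable provided both clocks are reasonably accurate; LastWriteTimeUtc is often a better "which copy is newer" signal than CreationTimeUtc. A minimal sketch with placeholder paths:

        using System;
        using System.IO;

        class NewerFileCheck
        {
            static void Main()
            {
                // Placeholder paths; in practice the two FileInfo objects would come from two machines.
                var localFile = new FileInfo(@"C:\data\report.docx");
                var remoteFile = new FileInfo(@"\\otherpc\share\report.docx");

                // Ticks = 100 ns intervals since 0001-01-01, independent of time zone when using the *Utc properties.
                long localTicks = localFile.LastWriteTimeUtc.Ticks;
                long remoteTicks = remoteFile.LastWriteTimeUtc.Ticks;

                if (localTicks > remoteTicks)
                    Console.WriteLine("Local copy is newer.");
                else if (localTicks < remoteTicks)
                    Console.WriteLine("Remote copy is newer.");
                else
                    Console.WriteLine("Both copies have the same timestamp.");
            }
        }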

    Read the article

  • Synchronizing time between two Windows 7 machines connected with a LAN cable

    - by Markus Roth
    I have a number of laptops that run our application while connected to each other in pairs with an Ethernet cable, but not connected to any external network or the internet. I need each connected pair to synchronize their system times, but since every computer needs to be able to sync with any other computer, I can't define one computer to be a time server and the other to be a client. Is there a way to do this with NTP? Or some other way?

    Read the article
