Search Results

Search found 6172 results on 247 pages for 'limit choices to'.


  • Add entire 300 GB filesystem to Git Annex repository?

    - by Ryan Lester
    By default, I get an error that I have too many open files from the process. If I lift the limit manually, I get an error that I'm out of memory. For whatever reason, it seems that Git Annex in its current state is not optimised for this sort of task (adding thousands of files to a repository at once). As a possible solution, my next thought was to do something like:

        cd /
        find . -type d | git annex add --$NONRECURSIVELY
        find . -type f | git annex add
        # Need to add parent directories of each file first or adding files fails

    The problem with this solution is that there doesn't seem from the documentation to be a way to non-recursively add a directory in Git Annex. Is there something I'm missing or a workaround for this? If my proposed solution is a dead end, are there other ways that people have solved this problem?
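
    One way around the open-file and out-of-memory errors is to hand the files to git-annex in fixed-size batches rather than in one enormous invocation. Below is a minimal sketch of that idea in Python; the repository root and batch size are placeholder assumptions, and it simply passes explicit paths to git annex add, so treat it as an illustration of the batching approach rather than a tested recipe.

        #!/usr/bin/env python3
        # Sketch: add files to a git-annex repository in small batches to avoid
        # "too many open files" / out-of-memory errors from one huge `git annex add`.
        import os
        import subprocess

        REPO_ROOT = "/mnt/bigdisk"   # hypothetical mount point of the 300 GB filesystem
        BATCH_SIZE = 1000            # files handed to git-annex per invocation

        def batches(items, size):
            """Yield successive fixed-size chunks of a list."""
            for i in range(0, len(items), size):
                yield items[i:i + size]

        def main():
            files = []
            for dirpath, dirnames, filenames in os.walk(REPO_ROOT):
                dirnames[:] = [d for d in dirnames if d != ".git"]   # skip git metadata
                files.extend(os.path.join(dirpath, name) for name in filenames)
            for chunk in batches(files, BATCH_SIZE):
                # Each invocation has at most BATCH_SIZE paths in flight.
                subprocess.check_call(["git", "annex", "add"] + chunk, cwd=REPO_ROOT)

        if __name__ == "__main__":
            main()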


  • What do I need to know, and how do I back up a recovery partition?

    - by PeanutsMonkey
    I have an HP laptop, specifically an HP Folio Ultrabook. I need to make an image of the hard drive so that, in the event it needs to be restored, I can do so with both the base operating system (Windows 7 Professional) and the HP recovery partition. I also need to back up all data that is on the laptop.

    - Where do I start? What software can I use? Please limit suggestions to freely available software or Linux.
    - I need to be able to back up the image to a file server and to an external hard drive.
    - Is there anything else I need to do or know? The laptop is being used by a user on a domain.


  • Fatal error: Out of memory (allocated ...) (tried to allocate ... bytes) not due to memory_limit setting

    - by Lorenz Meyer
    For a few days now, I have been getting the following error on my server:

        Fatal error: Out of memory (allocated 262144) (tried to allocate 393216 bytes)

    Usually this error is due to memory consumption exceeding the configured memory_limit, but in my case there is no such relation: memory_limit is set to 128 MB, and here we do not even reach 1 MB. The server also does not have a big load; in fact it is an intranet server with just a few people connected to it.

    System: Windows Server 2003, 1 GB RAM (only 600 MB used), Apache 2.2.4, PHP 5.2.3.

    The error appears randomly, and the allocation at which it fails also varies randomly between a few kB and a few MB. Sometimes restarting Apache is required to get rid of the error, sometimes it disappears by itself; restarting Apache or the entire server only helps temporarily. Where could this problem come from? How could I narrow down the source of the error?


  • What's the deal with NTFS tags in Windows 7?

    - by polarix
    So back in the days of 'Longhorn' there was this WinFS idea, which looked both cool and scary. Then it seemed to disappear, but we were told that many of the concepts would be rolled into Vista, and then maybe Windows 7. Anyway, nowadays if you look at a Windows 7 Explorer window, you can add columns with a lot of tag-based info about a file (right-click on a column header, then More...), including one called "Tags". Is this something in NTFS that can be modified per file somehow? Is the GUI for it hidden somewhere, is the feature indefinitely delayed, or is it just a figment of my imagination? It sure would be nice to be able to get around the 256-character NTFS path limit for searches, and to filter file folders as in Excel 2007.


  • supervise/daemontools conflicts with apache -D FOREGROUND

    - by Kevin G.
    Hoping that somebody can help us understand this behavior. We've got a bunch of daemontools services under /etc/service/. One of the services controls Apache, and its run script contains this:

        exec envdir /var/lib/supervise/wwwproxy/env setuidgid root bash <<-BASH
            ulimit -n 8192  # also increase the running user's file descriptor limit
            exec apache2 -f /path/to/demo_apache2.conf -D FOREGROUND
        BASH

    We were having the problem that svc -d /etc/service/* actually had the effect of restarting all the services rather than taking them down. We finally tracked it down to that one service, and found that svc -d /etc/service/apache2 would bring up any other service that was down, including itself. Changing FOREGROUND to NO_DAEMONIZE fixes the behavior, but we'd really like to understand what's going on. Can anybody explain why an svc -d on one service would bring another service up? Thanks for any clue you can offer.


  • Unable to execute native SQL query

    - by Renjith
    I am developing an application with Spring and Hibernate. In the DAO class, I was trying to execute a native SQL query as follows:

        SELECT * FROM product ORDER BY unitprice ASC LIMIT 6 OFFSET 0

    But the system throws an exception:

        org.hibernate.HibernateException: No Hibernate Session bound to thread, and configuration does not allow creation of non-transactional one here
            org.springframework.orm.hibernate3.SpringSessionContext.currentSession(SpringSessionContext.java:63)
            org.hibernate.impl.SessionFactoryImpl.getCurrentSession(SessionFactoryImpl.java:544)
            com.dao.ProductDAO.listProducts(ProductDAO.java:15)
            com.dataobjects.impl.ProductDoImpl.listProducts(ProductDoImpl.java:26)
            com.action.ProductAction.showProducts(ProductAction.java:53)
            sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

    application-context.xml is shown below:

        <bean id="propertyConfigurer"
              class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"
              p:location="/WEB-INF/jdbc.properties" />

        <bean id="dataSource"
              class="org.springframework.jdbc.datasource.DriverManagerDataSource"
              p:driverClassName="${jdbc.driverClassName}"
              p:url="${jdbc.url}"
              p:username="${jdbc.username}"
              p:password="${jdbc.password}" />

        <!-- Hibernate SessionFactory -->
        <!-- class="org.springframework.orm.hibernate3.LocalSessionFactoryBean" -->
        <bean id="sessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
            <property name="dataSource">
                <ref local="dataSource"/>
            </property>
            <property name="configLocation">
                <value>WEB-INF/classes/hibernate.cfg.xml</value>
            </property>
            <property name="configurationClass">
                <value>org.hibernate.cfg.AnnotationConfiguration</value>
            </property>
            <!--
            <property name="annotatedClasses">
                <list>
                    <value>com.pojo.Product</value>
                    <value>com.pojo.User</value>
                    <value>com.pojo.UserLogin</value>
                </list>
            </property>
            -->
            <property name="hibernateProperties">
                <props>
                    <prop key="hibernate.dialect">${hibernate.dialect}</prop>
                    <prop key="hibernate.show_sql">true</prop>
                </props>
            </property>
        </bean>

        <!-- User Bean definitions -->
        <bean name="/logincheck" class="com.action.LoginAction">
            <property name="userDo" ref="userDo" />
        </bean>
        <bean id="userDo" class="com.dataobjects.impl.UserDoImpl">
            <property name="userDAO" ref="userDAO" />
        </bean>
        <bean id="userDAO" class="com.dao.UserDAO">
            <property name="sessionFactory" ref="sessionFactory" />
        </bean>

        <bean name="/listproducts" class="com.action.ProductAction">
            <property name="productDo" ref="productDo" />
        </bean>
        <bean id="productDo" class="com.dataobjects.impl.ProductDoImpl">
            <property name="productDAO" ref="productDAO" />
        </bean>
        <bean id="productDAO" class="com.dao.ProductDAO">
            <property name="sessionFactory" ref="sessionFactory" />
        </bean>

    And the DAO class is:

        public class ProductDAO extends HibernateDaoSupport {

            public List listProducts(int startIndex, int incrementor) {
                org.hibernate.Session session = getHibernateTemplate().getSessionFactory().getCurrentSession();
                String queryString = "SELECT * FROM product ORDER BY unitprice ASC LIMIT 6 OFFSET 0";
                List list = null;
                try {
                    session.beginTransaction();
                    org.hibernate.Query query = session.createQuery(queryString);
                    list = query.list();
                    session.getTransaction().commit();
                } catch (Exception e) {
                    e.printStackTrace();
                } finally {
                    session.close();
                }
                return list;
            }

            public List getProductCount() {
                String queryString = "SELECT COUNT(*) FROM Product";
                return getHibernateTemplate().find(queryString);
            }
        }

    Any thoughts on how to fix this?


  • Where would an S3 upload speed cap originate?

    - by CoreyH
    I do a ton of uploading to S3, I am experiencing capped speeds, and I can't quite figure out how to address it. The setup: Windows Server 2008 R2 x64, an external HD, a Java-based upload tool called Jsh3ll, and custom VBS scripts to kick the jobs off. Running one process at a time, I am always limited to about 4 Mbps. I have FiOS at 35/35 Mbps, so it isn't an outright line limit. And I can run parallel instances and go all the way up to 35 Mbps, so I know the problem isn't gateway/NIC/machine/Amazon related. Running parallel instances works to a degree as a solution, but it greatly increases the complexity of my workflow; solving this would make my life dramatically easier. When I was first doing this I played around with a bunch of Windows TCP parameters and was briefly able to get unconstrained bandwidth, but it wasn't repeatable. Thoughts?
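
    Since parallel transfers are what actually recovers the bandwidth here, one way to keep the workflow simple is to let a single script manage the parallelism. A rough sketch using Python with boto3 and a thread pool is below; boto3 is not the tool mentioned in the question (Jsh3ll is Java-based), and the bucket name, source directory, and worker count are placeholders.

        # Sketch: upload many files to S3 in parallel from one process, so the
        # per-connection ~4 Mbps cap is worked around without juggling several
        # tool instances by hand. Bucket, directory, and worker count are assumptions.
        import os
        from concurrent.futures import ThreadPoolExecutor, as_completed

        import boto3

        BUCKET = "example-bucket"       # hypothetical bucket name
        LOCAL_DIR = r"E:\to_upload"     # hypothetical source directory
        WORKERS = 8                     # parallel connections

        s3 = boto3.client("s3")

        def upload_one(path):
            key = os.path.relpath(path, LOCAL_DIR).replace(os.sep, "/")
            s3.upload_file(path, BUCKET, key)   # one connection per in-flight file
            return key

        def main():
            files = [os.path.join(root, name)
                     for root, _, names in os.walk(LOCAL_DIR)
                     for name in names]
            with ThreadPoolExecutor(max_workers=WORKERS) as pool:
                futures = {pool.submit(upload_one, p): p for p in files}
                for fut in as_completed(futures):
                    print("uploaded", fut.result())

        if __name__ == "__main__":
            main()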


  • How to restrict access to a specific wireless network to only 1 user profile in Windows 7.

    - by Mathlight
    Hi all, I'm using Windows 7 SP1. I've got multiple users on the laptop who can / must connect to a wireless network, let's call it Wireless1. I've also got a second wireless network (let's call it Wireless2) which I want to limit to only the admin user of the laptop. I can remove Wireless2 in the network manager every time, but I want a more user-friendly solution, so that only the admin can connect to Wireless2 and all the other users cannot (they may see the network, but must enter the password, like with any other network). Any ideas?


  • VPN service for 4in6

    - by Deshene
    I have a local network with internet access, but unfortunately the IPv4 internet connection speed is limited to 1 Mbps, which is really sad. Fortunately I have native IPv6, and there is no connection speed limit over IPv6. So, in order to get a good internet connection, I made a plan: connect to a VPN service over IPv6 and pass all IPv4 traffic through the IPv6 tunnel, or something like that; I think you get the idea. I considered using a service like HideMyAss.com, but unfortunately they don't support IPv6. The question is: is there any existing VPN service that will make my dreams come true, is easy to use, and that I can connect to over PPTP or OpenVPN (I want to set up the VPN connection in my router settings)?


  • ESXi 5.1 - Unable to register host

    - by deanvz
    I downloaded and successfully installed ESXi 5.1. I am, however, unable to get the licence key I received installed. An error occurred when assigning the specified licence key: "The system Memory is not satisfied with the 32 GB of Maximum memory limit. Current with 80.00 GB of Memory." Is there no way around this? A quick Google search revealed that this is a widespread problem with no real answer or resolution. The only workaround is to remove the physical RAM chips, but as this is going to be in production I don't want to do that, since it would mean downtime when I have to reinsert the memory.


  • Configuring sendmail to use one outbound MTA exclusively

    - by Charlie Martin
    I have a sendmail problem, and I'm anything but a sendmail guru -- I could use some help. My problem is that I have a system intended to be more or less an "appliance" -- it's not intended to have an admin. Because of this, it needs to be able to "call home" by sending email. As we have configured it, this works fine: using sendmail, it finds the appropriate relay by looking up an MX record and everything works. Now, however, because of security concerns, we want to limit it to using exactly one relay, for example relay.corp.example.com. Should the user configure it to use, say, fubar.example.com, the mail sending should fail or be deferred. I thought that by configuring sendmail with an /etc/mail/server.switch file containing "hosts files" (i.e., host lookups from files only, without dns), I'd get that effect. This doesn't work -- instead, if it gets mail addressed to [email protected], it tries to talk directly to example.com and ignores the configured server. Any ideas?


  • VPN provider for remote access to servers from a known IP address

    - by brentkeller
    My organization has a few servers hosted by a provider, and we limit remote access to a whitelist, denying access to any IPs not on it. We would like to find a hosted VPN service that would give us a known IP address we could add to our whitelist, so we can reach the servers while on the road. Does anyone know of any such services? I don't think we can just set up the VPN built into Windows Server, since the servers are hosted. Any suggestions would be appreciated.


  • Deployment from OVA format

    - by Manvendra Bele
    I am deploying a VM from an OVA package. The OVA is 57 GB, and the free space on my datastore is currently 388 GB. When selecting the disk format, it shows in red that the required disk size is 1 TB, so thick provisioning cannot be selected; therefore I selected thin provisioning. With thin provisioning the estimated disk usage shown is 112 GB, which is less than the available free space. But even with thin provisioning selected, the deployment throws an error that it cannot create the disk because the disk size is larger than the maximum supported limit. My block size is 1 MB. Pasting my exact error here:

        Failed to deploy OVF package: File [datastore1] IMS Tester 1/IMS Tester1_2.vmdk is larger than maximum size supported by datastore 'datastore1
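
    The numbers in the question line up with a per-file size ceiling rather than a free-space problem. If this datastore is VMFS-3 (an assumption; the datastore version is not stated), the maximum size of a single file scales with the block size, roughly 256 GB at a 1 MB block size, and a thin-provisioned disk still carries its full 1 TB provisioned size. A hedged back-of-envelope check of that reading:

        # Back-of-envelope check (assumption: VMFS-3, where the maximum file size
        # is roughly 256 GB multiplied by the block size in MB).
        block_size_mb = 1             # block size reported in the question
        provisioned_disk_gb = 1024    # the 1 TB disk described by the OVA

        max_file_gb = 256 * block_size_mb
        print("max vmdk size: %d GB" % max_file_gb)                            # 256 GB
        print("disk fits on datastore:", provisioned_disk_gb <= max_file_gb)   # False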


  • Mac Console.app not logging any messages

    - by karl_
    I recently attempted to overcome the 500-message limit on console logs using the advice provided here: Mac: Extend or disable 500 Messages Limit of Console. I copied the PLIST file onto my desktop, made the modifications, and copied it back into the LaunchDaemon folder. No dice. Unfortunately, this also broke logging in general: the console hasn't logged a message since I attempted this change. I even went back and undid my changes; still no logs. What's going on? Is there a way to reinstall Console.app, or to revert to the original settings?


  • Does NTFS performance degrade significantly in volumes larger than five or six TB?

    - by Josh Yeager
    One of my customers is planning to set up a new document store, which will probably grow by 1-2TB per year. One of my co-workers says that Windows performance is extremely bad if it has a single NTFS volume that is bigger than five or six TB. He thinks that we need to set up their system with multiple volumes so that no single volume will exceed that limit. Is this a real problem? Does Windows or NTFS slow down when the volume size reaches several terabytes? Or is it possible to create a single volume of 10 or more TB?


  • Multiple .bkf files created in Backupexec 12.5 or 2010 related to heavy I/O?

    - by syuusuke
    Hey everyone, I was wondering whether anyone who has used Backup Exec 12.5 or 2010 has ever seen multiple .bkf files created for a single job. To describe what I mean: the .bkf files are being created with random sizes under 2 GB, even though I've configured the setting to split the file only after it reaches 10 GB. Some jobs will create 20 .bkf files in a single job, with chunks ranging from 50 MB to 800 MB. Is this a sign of heavy I/O issues? Bandwidth limitations? I'm not sure; I'm here to seek advice and suggestions. I've set up another backup server with the exact same settings, and it only seems to create a new .bkf file when the 10 GB limit has been reached. I am backing up different machines, but I know my settings are an exact match to the problematic server, or at least I think there is a problem.


  • Office 365 - unable to deactivate

    - by Jake
    We are using Office 365 ProPlus 2013. A new user tried to activate their install and received an error that they had reached their install limit of five machines. Upon clicking the link to deactivate previous installs that appears in that error dialog, the user is taken to their Office software management tab. Usually, if the user has previous installs, they are listed there and the user is able to deactivate them. In this case, however, no previous installs appear, so it seems something else may be the problem. I am looking for any suggestions as to what may be wrong. Thanks.


  • squid ip based authentication

    - by Ian R.
    I have 10 IPs on a VPS with squid3 installed, and I want to lease all of them to 10 co-workers. The authentication should be IP-based: basically I want to allow only each person's home IP address (not an internal one; we're not on a shared network) to connect to my Squid. I would also like to offer each of them a dedicated IP from my outgoing addresses. I managed to get this working using username/password authentication, but some software does not support that feature, so I would like to switch to this IP-based restriction if possible. Any guidance or sample ACLs?


  • Can't open Paypal.com with Google Chrome

    - by grunwald2.0
    For a week now I have been getting an error message when trying to open the PayPal website with Google Chrome, and I don't know why. FlashBlocker and AdBlockPlus are deactivated. Chrome version 20.0.1132.11 dev. Error message:

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>400 Bad Request</title>
        </head><body>
        <h1>Bad Request</h1>
        <p>Your browser sent a request that this server could not understand.<br />
        Size of a request header field exceeds server limit.<br />
        <pre>
        Cookie: Apache=10.190.8.170.1302997118916547; (cookie body removed due to privacy reasons)
        </pre>
        </p>
        </body></html>


  • What Device/System to use as a "router on a stick"

    - by Jeff Leyser
    I need to create several distinct VLANs and provide a way for traffic to move between them. A "router on a stick" approach seems ideal:

        Internet
            |
        Router with Trunking Capability ("router on a stick")
            *
            *   Trunk between router and switch
            *
        Switch with Trunking Capability
          |            |            |            |            |
        LAN 1        LAN 2        LAN 3        LAN 4        LAN 5
        10.0.1.0/24  10.0.2.0/24  10.0.3.0/24  10.0.4.0/24  10.0.5.0/24

    We have trunk-capable Layer 2 switches. The question is what to use as the router on a stick. My choices seem to be:

    1) Use an existing Cisco ASA 5505 firewall. It appears the ASA can do the routing, but it's a 100 Mbps device, so it seems sub-optimal at best.
    2) Buy a router. This seems like overkill.
    3) Buy a Layer 3 switch. Also seems like overkill.
    4) Use an existing Linux box as a router.
    5) Use a new Linux box as a router.
    6) Something I'm not thinking of.

    I think either (4) or (5) is my best option, but I'm not sure how to choose between them. I expect the amount of traffic that has to cross the VLANs to be fairly small, but bursty. How much load does routing add to a CentOS machine?


  • Only allow the POST method for a specific file in a directory

    - by Dave Chen
    I have one file that should only be accessible via the POST method: /var/www/folder/index.php. The document root is /var/www/, and index.php is nested inside a folder. My configuration is as follows:

        <Directory "/var/www/folder">
            <Files "index.php">
                order deny,allow
                Allow from all
                <LimitExcept POST>
                    Deny from all
                </LimitExcept>
            </Files>
        </Directory>

    I visit my server at 127.0.0.1/folder, but I can GET and POST the file just like normal. I've also tried reversing the order (order allow,deny), as well as Require, LimitExcept, and Limit. How can I allow only POST requests to be processed by one file in a folder?
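
    For checking whether the restriction actually took effect after a config reload, a small client-side check can be less ambiguous than a browser. A sketch with Python's standard urllib follows; the URL assumes the vhost from the question answers on 127.0.0.1.

        # Sketch: confirm that GET is rejected and POST is allowed for the file.
        # The URL is an assumption based on the paths in the question.
        import urllib.error
        import urllib.request

        URL = "http://127.0.0.1/folder/index.php"

        def status(method):
            req = urllib.request.Request(URL, data=b"" if method == "POST" else None,
                                         method=method)
            try:
                with urllib.request.urlopen(req) as resp:
                    return resp.status
            except urllib.error.HTTPError as err:
                return err.code   # e.g. 403 once the LimitExcept block applies

        print("GET  ->", status("GET"))   # expected 403 when only POST is allowed
        print("POST ->", status("POST"))  # expected 200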


  • ERROR : MySQL server has gone away while running query

    - by Rashmi Nama
    I am using Ubuntu 12.04. I can connect to MariaDB from the command prompt without problems, and I have a database named Dealer with some tables in it, but when I run any query it gives an error. My steps are as follows:

        mysql -uroot -proot
        use dealer;
        select * from dealer_outlet limit 1;

    Now the error occurs:

        ERROR 2006 (HY000): MySQL server has gone away
        No connection. Trying to reconnect...
        Connection id:    3
        Current database: dealer
        ERROR 2006 (HY000): MySQL server has gone away
        No connection. Trying to reconnect...
        ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (111 "Connection refused")
        ERROR: Can't connect to the server


  • Count the Number of Characters in a Full File Path?

    - by Richie086
    I need to be able to count the number of characters in the full path to each file in Windows. How I currently accomplish this task:

    1. Open a command prompt.
    2. cd to the directory in question (for example c:\CruiseControl\ProjectArtifacts\ProjectName).
    3. Type the following command: dir /s /b > output.csv
    4. Open the resulting output.csv file in Excel.
    5. Use the =LEN() function in Excel to count the number of characters per row as listed in the output.csv file.

    Does anyone know of an Explorer shell extension or some third-party tool that could perform this function without me having to manipulate the output from dir in Excel? Is there some easier way to go about doing this? The root of the issue is the ~260-character file path limit in Windows: I am trying to find which paths are close to 260 characters so I can shorten them and avoid this error.
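
    The whole dir-plus-Excel round trip can also be collapsed into one small script that walks the tree and reports paths near the limit. A sketch in Python follows; the starting folder and the ~260-character limit come from the question, while the output filename and the 240-character warning threshold are assumptions.

        # Sketch: list every file path under a root and flag those close to the
        # ~260-character Windows path limit, without going through Excel.
        import csv
        import os

        ROOT = r"C:\CruiseControl\ProjectArtifacts\ProjectName"  # folder from the question
        THRESHOLD = 240                     # flag paths approaching the ~260 limit
        OUTPUT = "path_lengths.csv"         # hypothetical output file

        with open(OUTPUT, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["length", "path"])
            for dirpath, _, filenames in os.walk(ROOT):
                for name in filenames:
                    full = os.path.join(dirpath, name)
                    writer.writerow([len(full), full])
                    if len(full) >= THRESHOLD:
                        print(len(full), full)   # echo the long ones to the console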


  • Ideal memory configuration: 4 banks, DDR3, AM3+ FX - 1 vs 2 vs 4 DIMMs?

    - by TardisGuy
    OK, so I've been looking around, trying to learn and understand the way RAM works. One answer I got said, "The addressing is best for 2 sticks, and when you use 4 it slows down." Another answer said something like: there's bank/channel interleaving that makes the memory read like one stick. I also read something about memory density being a factor. I dug further and found out that my board has a higher speed limit for 2 sticks than for 4, so now I'm trying to build a picture in my head of how and why, and... pfft. Can anyone explain, or recommend a resource that would answer these questions?


  • SQL Server 2005 64bit on 16GB machine uses 3.6GB memory only

    - by ArjanP
    - Maximum server memory is set to the maximum value
    - AWE is disabled (it should not be needed in 64-bit anyway)
    - Windows Server 2008 Enterprise SP2
    - It is a virtual server running on VMware

    If I look in Task Manager, the sqlservr.exe process only uses about 3.6 GB of memory. Is that number not real? Shouldn't it attempt to use all available memory? If I run DBCC MEMORYSTATUS I get:

        VM Reserved    16670136
        VM Committed    3640664

    It looks like a memory limit I shouldn't be seeing in a 64-bit environment. How can I get SQL Server 2005 to use more memory?

