Search Results

Search found 10693 results on 428 pages for 'stay updated'.

  • yum error when installing memcached

    - by Jack
    Hi, trying to install memcached with "yum install memcached" and I'm getting all these errors, which I have no idea how to solve:

        Setting up Install Process
        Resolving Dependencies
        --> Running transaction check
        ---> Package memcached.x86_64 0:1.4.5-1.el5.rf set to be updated
        --> Processing Dependency: perl(AnyEvent) for package: memcached
        --> Processing Dependency: perl(AnyEvent::Socket) for package: memcached
        --> Processing Dependency: perl(AnyEvent::Handle) for package: memcached
        --> Processing Dependency: perl(YAML) for package: memcached
        --> Processing Dependency: perl(Term::ReadKey) for package: memcached
        --> Processing Dependency: libevent-1.1a.so.1()(64bit) for package: memcached
        --> Running transaction check
        ---> Package compat-libevent-11a.x86_64 0:3.2.1-1.el5.rf set to be updated
        ---> Package memcached.x86_64 0:1.4.5-1.el5.rf set to be updated
        --> Processing Dependency: perl(AnyEvent) for package: memcached
        --> Processing Dependency: perl(AnyEvent::Socket) for package: memcached
        --> Processing Dependency: perl(AnyEvent::Handle) for package: memcached
        --> Processing Dependency: perl(YAML) for package: memcached
        --> Processing Dependency: perl(Term::ReadKey) for package: memcached
        --> Finished Dependency Resolution
        memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems
          --> Missing Dependency: perl(AnyEvent::Socket) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge)
        memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems
          --> Missing Dependency: perl(AnyEvent) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge)
        memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems
          --> Missing Dependency: perl(AnyEvent::Handle) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge)
        memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems
          --> Missing Dependency: perl(YAML) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge)
        memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems
          --> Missing Dependency: perl(Term::ReadKey) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge)
        Packages skipped because of dependency problems:
          compat-libevent-11a-3.2.1-1.el5.rf.x86_64 from rpmforge
          memcached-1.4.5-1.el5.rf.x86_64 from rpmforge

    The Perl modules that it's complaining about are already installed. Any ideas?
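
    One thing worth checking (a sketch, not a confirmed fix): yum only satisfies dependencies from packages recorded in the RPM database, so Perl modules installed from CPAN will not satisfy perl(AnyEvent) and friends even though they are importable. The package names below are assumptions based on common rpmforge/EPEL naming and may differ on your system.

        # Were the modules installed as RPMs, or via CPAN?
        rpm -q --whatprovides 'perl(AnyEvent)' 'perl(YAML)' 'perl(Term::ReadKey)'

        # If nothing provides them, install the packaged versions first, then retry
        yum install perl-AnyEvent perl-YAML perl-TermReadKey
        yum install memcached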

  • Read Only Domain Controllers and DNS zone updates

    - by Mike M
    I have a Windows 2003 domain and just added a new DC that runs 2008 R2. I updated the schema accordingly for both forest and domain levels. I also made sure to run /rodcprep at the time I did this. I have a branch office with a 2008 R2 file/print server that is a read-only domain controller (DC). The one problem I have been having is with AD-integrated DNS records updates. In the data center, we had to make an IP address change on a particular server. All our other sites' DCs (2003) updated the record fine. The 2008 R2 DC in the data center also updates its record fine. However, the RODC in the branch office does not. So if I nslookup the target server on a 2003 DC, the IP address is correct. Same with the 2008 R2 DC in the data center. But an nslookup on the branch office RODC still pulls in the old IP address. Moreover, any new records we've created (e.g., just added a new terminal server) do not get updated on the branch RODC either. Is there something simple I'm missing? How do I get the RODC to sync its AD-integrated DNS records with the rest of my world? Thank you in advance for your responses. Mike
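
    An RODC only receives AD-integrated zone data through inbound AD replication, so forcing a replication cycle and then reloading the zone from the directory is a reasonable first test. The sketch below uses built-in tools; <RODC-name> and the zone name are placeholders for your environment.

        :: Pull changes from replication partners into the RODC (all partitions, cross-site)
        repadmin /syncall <RODC-name> /A /e

        :: Then reload the AD-integrated zone from the directory on the RODC
        dnscmd <RODC-name> /zoneupdatefromds yourdomain.local

        :: Review replication status and any errors for the RODC
        repadmin /showrepl <RODC-name>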

  • Refresh file access time under Linux / Discard disk read cache

    - by calandoa
    I am making use of the access time to analyse a build process, but it is not working the way I want: the access time is updated the first time I read the file, then it stays the same for a long while, or until the next reboot. For instance:

        $ ll -u some_file
        -rw-r--r-- 1 root root 1.3M 2010-04-07 10:03 some_file
        $ grep abcdef some_file
        $ ll -u some_file
        -rw-r--r-- 1 root root 1.3M 2010-04-07 11:24 some_file
        # The access time is updated
        # waiting a few minutes...
        $ grep abcdef some_file
        $ ll -u some_file
        -rw-r--r-- 1 root root 1.3M 2010-04-07 11:24 some_file
        # The access time has not been updated :(

    I suppose that the file is buffered by Linux in free memory, and only this cached copy is accessed on subsequent reads, for speed reasons. A solution would be to discard the buffers in memory. After searching some forums, I found:

        sync
        echo 1 > /proc/sys/vm/drop_caches
        echo 2 > /proc/sys/vm/drop_caches
        echo 3 > /proc/sys/vm/drop_caches

    But it is not working; it seems to only sync the write buffers, not the read ones. Maybe it is due to some custom kernel configuration on my distro (Fedora 9)? Or am I missing something here? Is there a way to achieve this access-time refresh? Note also that I do not want to simulate writes on my entire file tree: because I am using a makefile-based build system, this would cause the entire project to be built again.
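
    For what it's worth, the usual sequence is to sync first and then drop the caches as root in one go; it is also worth checking the mount options, since a filesystem mounted with noatime (or a kernel defaulting to relatime) will suppress or batch atime updates even when the file really is read from disk. A small sketch:

        # flush dirty pages, then drop page cache, dentries and inodes (run as root)
        sync
        echo 3 > /proc/sys/vm/drop_caches

        # check the mount options of the filesystem holding the files
        grep ' / ' /proc/mounts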

  • CentOS 6.0 yum audacity dependency

    - by Kaemic
    I'm trying to install Audacity on CentOS using yum and I can't force yum to resolve all dependencies. Here's what I got; can someone help me with it? (When I download the rpm file and click-install it, I get the dependency problem too.)

        # yum --disablerepo=c6-media install audacity-1.2.3-2.2.el4.rf.i386.rpm
        Loaded plugins: fastestmirror, refresh-packagekit
        Loading mirror speeds from cached hostfile
         * base: mirror.karneval.cz
         * centosplus: mirror.karneval.cz
         * extras: centos.vieth-server.de
         * rpmforge: fr2.rpmfind.net
         * updates: mirror.karneval.cz
        Setting up Install Process
        Examining audacity-1.2.3-2.2.el4.rf.i386.rpm: audacity-1.2.3-2.2.el4.rf.i386
        Marking audacity-1.2.3-2.2.el4.rf.i386.rpm to be installed
        Resolving Dependencies
        --> Running transaction check
        ---> Package audacity.i386 0:1.2.3-2.2.el4.rf set to be updated
        --> Processing Dependency: wxGTK >= 2.4.0 for package: audacity-1.2.3-2.2.el4.rf.i386
        --> Processing Dependency: libwx_gtk-2.4.so.0 for package: audacity-1.2.3-2.2.el4.rf.i386
        --> Processing Dependency: libwx_gtk-2.4.so.0(WXGTK_2.4) for package: audacity-1.2.3-2.2.el4.rf.i386
        --> Running transaction check
        ---> Package audacity.i386 0:1.2.3-2.2.el4.rf set to be updated
        --> Processing Dependency: libwx_gtk-2.4.so.0 for package: audacity-1.2.3-2.2.el4.rf.i386
        --> Processing Dependency: libwx_gtk-2.4.so.0(WXGTK_2.4) for package: audacity-1.2.3-2.2.el4.rf.i386
        ---> Package wxGTK.i686 0:2.8.12-1.el6.rf set to be updated
        --> Finished Dependency Resolution
        Error: Package: audacity-1.2.3-2.2.el4.rf.i386 (/audacity-1.2.3-2.2.el4.rf.i386)
               Requires: libwx_gtk-2.4.so.0
        Error: Package: audacity-1.2.3-2.2.el4.rf.i386 (/audacity-1.2.3-2.2.el4.rf.i386)
               Requires: libwx_gtk-2.4.so.0(WXGTK_2.4)
         You could try using --skip-broken to work around the problem
         You could try running: rpm -Va --nofiles --nodigest
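
    Part of the problem is that audacity-1.2.3-2.2.el4.rf is an EL4-era package built against wxGTK 2.4, while the repositories for CentOS 6 only carry wxGTK 2.8, so that specific rpm can never resolve. A hedged alternative (assuming a repository such as EPEL or rpmforge provides an EL6 build) is to let yum pick a matching build instead:

        # see which repositories offer a build made for EL6
        yum --disablerepo=c6-media list available 'audacity*'

        # and install that instead of the el4 rpm
        yum --disablerepo=c6-media install audacity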

  • Dell Studio 1555 not starting up

    - by Abhishek
    This is a 3-year-old laptop. I never had a big problem with it until now. I updated Kubuntu the night before yesterday; Firefox got updated to version 18 and a few other related packages were updated as well. Then I shut down the laptop and restarted it, but it failed to start. I could hear the fan, the hard disk and the optical disk drive initialize, and the power button also lit up. But there was no video - no POST or BIOS menu. I even opened the laptop up to the point where the motherboard was the only thing attached to the base cover. I took it to the technician this evening. He checked it casually and said it might be a motherboard problem that will cost quite a bit to fix, though he was not sure and said he would give me a call after confirming the problem. Has anyone else had the same problem? What was it and what was the fix?

  • Finding useful crash-information in Windows 8 Consumer Preview

    - by Lukas Knuth
    I'm currently diving into C# and wanted to play around with the new Metro-styled applications introduced with Windows 8, so I updated my Windows 7 to Windows 8 Consumer Preview. The problem I'm facing right now is that the system freezes after 3-5 minutes. It does not take any input from the keyboard or mouse and it does not recover (at least not in less than 10 minutes). Since I have a background in Linux, I'd like to find some information about the cause of the freeze, but I have no idea where to search. I checked the system logs (under "System Control" - "Management") but they only record that the system was shut down unexpectedly (due to the fact that I held down the power button to reboot the PC). There is no useful crash information in there. I don't want to spend hours randomly reinstalling drivers and doing things that "might help". Isn't there any place I can find some useful information about the freeze? Before you ask: I installed Windows 8 as an update on my old Windows 7 installation (which worked fine, by the way). My hardware meets the minimum requirements (specs can be found here, the MacMini 3,1 model with 2GHz processor). I have updated the graphics-card drivers to the newest Windows 8 drivers from nVidia.
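
    A complete lockup often leaves nothing in the System log unless a bugcheck actually fired, but two places sometimes hold clues; a rough sketch (run from an elevated command prompt), assuming default paths:

        :: any kernel memory dumps written after a crash?
        dir C:\Windows\Minidump

        :: hardware-error (WHEA) entries from the System log
        wevtutil qe System /q:"*[System[Provider[@Name='Microsoft-Windows-WHEA-Logger']]]" /c:20 /f:text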

  • Win 7 crashes, PC reboots and says "Hard drive 0 not found" until I turn it off and on again

    - by Danny T.
    I recently made the move from Windows XP to Windows 7. Since then, when my computer has been on for a few hours it always ends up rebooting without warning. Then the BIOS won't recognize my hard drive ("hard drive 0 not found"). If I turn off my computer and then on again, it boots normally. Some details: Dell Dimension 9150, Windows 7. I updated the BIOS. I updated all system drivers with the latest versions from Dell (SATA, chipset, etc.). Other drivers were updated too (graphics card, sound, etc.). There is one thing that I tried after some Googling: I turned off DMA access to the drives, but it's still rebooting after a few hours. Any clue?
    UPDATE 2010/12/13: Here are the events from the Event Log for today, from when I turned the computer on until it crashed:
        19:17 - Error - ID 10016 - DistributedCom
        20:06 - Error - ID 1008 - Customer Improvement Program (could not send data to Microsoft)
        21:48 - Critical - ID 41 - Kernel-Power (System was restarted without proper shutdown)
        21:48 - Error - ID 6008 - EventLog (Previous system down was not planned)
        21:48 - Error - ID 1101 - EventLog (Audit Event ignored)
        21:49 - Error - ID 10016 - DistributedCom
    Both DistributedCom events have a description along these lines (translated from French): The authorisation parameters specific to the application do not allow Local Execution for the COM server application with the CLSID {C97FCC79-E628-407D-AE68-A06AD6D8B4D1} and the APPID {344ED43D-D086-4961-86A6-1106F4ACAD9B} to the SID AUTHORITY NT\User System (S-1-5-18) from the address LocalHost (LRPC usage). This security authorisation can be changed with the Component Services administration tool.
    UPDATE 2010/12/31: Here are the error messages I have on blue screens:
        STOP C000007xA - Kernel_Data_Inpage_Error
        "Unknown hard error" C00000135 - Can't start because &hs is missing

  • Losing internet connection after few minutes (5-10 maybe)

    - by Korchkidu
    I took over a computer that had not been updated for months. The Internet was working just fine, so basically I updated ZoneAlarm and Avast and installed all Windows updates, especially SP3. After that, when I reboot, the Internet works fine, but after a few minutes Firefox says that the connection was reset. IE does not work either. However, my connection is still up and running, as I can ping www.google.com, for example. Here are the solutions I tried with no success so far: 1) Uninstalling SP3; 2) Uninstalling IE8 and IE7; 3) Manually setting DNS and IPs; 4) Removing the proxy settings from Firefox and IE; 5) Restarting DNS- and DHCP-related services; 6) Resetting TCP/IP with netsh int ip reset c:\resetlog.txt; 7) Updating my ethernet card driver; 8) Restarting and tweaking every connection setting and configuration I could think of; 9) Disabling ZoneAlarm and Avast. Also, update KB981793 always fails to install. Please help me, as I have already spent two days on this and I cannot find any solution. If I cannot fix this problem tomorrow, I will have to format and reinstall everything. Thanks for any help. Regards.
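
    Since ping works but both browsers fail, the Winsock layer is a common suspect after SP3, especially with firewall products such as ZoneAlarm that install LSPs. A hedged sketch, to be run from an elevated command prompt and followed by a reboot:

        rem list the installed Winsock LSPs (look for firewall entries)
        netsh winsock show catalog

        rem reset the Winsock catalog to a clean state, then reboot
        netsh winsock reset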

  • How to see the properties of a DOM element as they change in realtime?

    - by allquixotic
    JavaScript code can update the properties/attributes of DOM elements in real time by responding to events and so on. Here is an example. In the table on that page, move your mouse over the cells. Notice how they change color when the mouse is on them, and the color goes away when you move the mouse to another cell. Now, using Firefox or Chrome (but not IE, Opera, etc.), I want to examine the background color, expressed in RGB or hex or whatever, of the cells updated in real time, as the mouse cursor enters and leaves the region and causes the JS to do its thing. The behavior that I observe, currently, is that the Inspect Element functionality of both Firefox and Chrome does not update the value of the properties as they are updated by JavaScript. So, in order to view the latest value of the property, I have to inspect the element again, and it takes a momentary "snapshot" of the values. But since the values only change while I have the mouse on them, I can't take a snapshot of the value I want while my mouse cursor is over the cell, because I have to remove my mouse from the cell to select the "Inspect Element" item in the right-click list! If it is possible to have the values updated in real time using either Firefox or Chrome, or an extension, on any recent version of the software (up to the latest stable), please provide instructions for doing so.
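
    One console-level alternative to the Inspect Element panel: a MutationObserver can log style/class changes on the cells as they happen, without having to move the mouse away. A minimal sketch to paste into the dev-tools console; the "td" selector is a placeholder, and this only catches changes made via attributes (not pure CSS :hover rules):

        // log attribute-driven style changes as they happen
        var observer = new MutationObserver(function (mutations) {
            mutations.forEach(function (m) {
                console.log(m.target, m.attributeName, '->',
                            window.getComputedStyle(m.target).backgroundColor);
            });
        });
        document.querySelectorAll('td').forEach(function (cell) {
            observer.observe(cell, { attributes: true, attributeFilter: ['style', 'class'] });
        });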

  • Managing SharePoint permissions via Active Directory?

    - by rgmatthes
    My company has thousands of employees organized thoroughly via Active Directory. I have confidence in the accuracy of the Department and Title information displayed in the user profiles. I'm helping to put up a brand new SharePoint 2007 site, and I contacted IT about managing the site's permissions through AD Groups. The goal is to have the site automatically assign read/write/contribute/whatever permissions based on the information in AD. For example, we could create an AD Group called "Managers" that would contain anyone with the "Manager" title in their AD user profile. I would have SharePoint tap into this AD Group to mass assign permissions if I knew all managers would need a certain level of access (read/write/contribute/whatever). Then if a manager joins the company or leaves it, the group is automatically updated (provided AD gets updated, of course). My IT rep called back and said it couldn't be done. This seems like a pretty straightforward business requirement, and one of the huge benefits of having Active Directory, but maybe I'm mistaken. Could anyone shed some light on this? A) Is it possible to use dynamically-updated AD Groups when assigning permissions via SharePoint? (Does anyone know of a guide I could show my doubtful IT rep?) B) Is there a "best practice" way to go about this? I've read some debate on whether SharePoint Groups or AD Groups are the way to go. My main concern is dynamic updating. C) If this isn't available out of the box, can someone recommend third-party software that will provide the functionality I'm looking for? A big thanks to anyone who can help me out!!

  • Force request to miss cache but still store the response

    - by Tom Marthenal
    I have a slow web app that I've placed Varnish in front of. All of the pages are static (they don't vary for a different user), but they need to be updated every 5 minutes so they contain recent data. I have a simple script (wget --mirror) that crawls the entire website every 15 minutes. Each crawl takes about 5 minutes. The point of the crawl is to update every page in the Varnish cache so that a user never has to wait for the page to generate (since all pages have been generated recently thanks to the spider). The timeline looks like this:
        00:00:00: Cache flushed
        00:00:00: Spider starts crawling to update cache with new pages
        00:05:00: Spider finishes crawling, all pages are updated until 1:15
    A request that comes in between 0:00:00 and 0:05:00 might hit a page that hasn't been updated yet, and will be forced to wait a few seconds for a response. This isn't acceptable. What I'd like to do is, perhaps using some VCL magic, always forward requests from the spider to the backend, but still store the response in the cache. This way, a user will never have to wait for a page to generate, since there is no 5-minute window in which parts of the cache are empty (except perhaps at server startup). How can I do this?
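
    For what it's worth, Varnish (2.1 and later, if memory serves) has req.hash_always_miss, which forces a cache miss for the request but still stores the fetched response for later hits - which is exactly the "refresh on crawl, serve from cache otherwise" behaviour described. A sketch keyed off wget's default User-Agent; adjust the match to however the spider identifies itself:

        sub vcl_recv {
            # the spider always refreshes the object; everyone else gets the cached copy
            if (req.http.User-Agent ~ "Wget") {
                set req.hash_always_miss = true;
            }
        }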

  • How to ensure the local file is up-to-date or ahead (Dropbox sync) before TrueCrypt auto-mounts it?

    - by user620965
    There are a lot of tutorials out there stating that Dropbox's built-in encryption is not secure enough. Those tutorials recommend syncing a TrueCrypt container file so that all the files inside it are securely encrypted. This setup is known to be limited: you can NOT have that TrueCrypt container file mounted at the same time in more than one location - if you have made changes to the contents of the container in more than one location at a time, this setup produces a conflict on the container file in the Dropbox system, resulting in one container file for each location. In my case that issue is not relevant - I do not use my data in more than one location at a time. I want to use the auto-mount feature of TrueCrypt on startup of Windows 7 to have a zero-configuration environment and start working right away. But I want to ensure that the local TrueCrypt container file is up-to-date before TrueCrypt mounts it automatically - imagine you updated the contents of the container at your primary location and your secondary location was off for a long time. In that case it can take "a long time" until the Dropbox sync is complete (e.g., depending on your internet connection and the size of the container file). There is an option in TrueCrypt that prevents it from updating the timestamp of the container file, which speeds up the sync because the Dropbox client then does a differential sync instead of a time-consuming full sync. That is an improvement to this setup, but it does not fix my issue. The question is: how do I make the auto-mount function wait for the container file to be up-to-date (updated by Dropbox)? In contrast, if the file was changed locally but the remote file (in the Dropbox cloud) is still old (not yet updated by the sync process, or the sync is in progress), TrueCrypt should not wait for the sync. Suggestions?

  • Use Dojo Drag and Drop together with Dojo Moveable

    - by Select0r
    Hi, I'm using Dojo.dnd to transfer items between two areas. The problem is: the items will snap into place once I drop them, but I'd like to have them stay where I drop them, but only for one area. Here's a little code to explain this better:

        <div id="dropZone" class="dropZone">
            <div id="itemNodes"></div>
            <div id="targetZone" dojoType="dojo.dnd.Source"></div>
        </div>

    "dropZone" is a DIV that contains two dojo.dnd.Source areas, "itemNodes" (created programmatically) and "targetZone". Items (DIVs with images) should be dragged from a simple list out of "itemNodes" into "targetZone" and stay where they are dropped. As soon as they are dragged out of "targetZone" they should snap back to the list inside "itemNodes". Here's the code I use to create the items:

        var nodelist = new dojo.dnd.Source("itemNodes");
        {Smarty-Loop}
        nodelist.insertNodes(false, ['<img class="dragItem" src="{$items->info.itemtext}" alt="{$items->info.itemtext}" border="0" />']);
        {/Smarty-Loop}

    But this way I just have two lists of items; the items dropped into "targetZone" won't stay where I dropped them. I've tried a loop dojo.query(".dojoDndItem").forEach(function(node) to grab all items and change them to a "moveable" type: using dojo.dnd.move.constrainedMoveable will change the items so they can always be moved around (even in "itemNodes"); using dojo.dnd.move.boxConstrainedMoveable and defining the "box" to the borders of "targetZone" makes it possible to just move the items around inside "targetZone", but as soon as I drop them, I can't grab and move them back out. So here's the question: is it possible to create two dnd.Sources where I can move items back and forth and let the items be "moveable" in only one of the sources? Or is there a workaround like making the items moveable and, if they're not dropped into "targetZone", having them moved back to the list in "itemNodes" automatically? Once the page is submitted, I have to save the position of every item that has been placed into "targetZone". (The next step will be positioning the items inside "targetZone" on page load if the grid has already been filled before, but I'd be happy to just get the thing working in the first place.) Any hint is appreciated. Greetings, Select0r
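
    One possible direction (a sketch against the Dojo 1.x dnd topics, untested): subscribe to "/dnd/drop" and, only when the drop target is targetZone, wrap each dropped node in a dojo.dnd.Moveable so it stays where it lands; nodes dropped back onto itemNodes are left alone and keep the normal list behaviour. The callback signature shown here is the 1.4-era one and should be double-checked against your Dojo version.

        dojo.require("dojo.dnd.Moveable");

        dojo.subscribe("/dnd/drop", function (source, nodes, copy, target) {
            if (target && target.node && target.node.id === "targetZone") {
                dojo.forEach(nodes, function (node) {
                    new dojo.dnd.Moveable(node);   // keep the node where it was dropped
                });
            }
        });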

  • Convert and convert back datetime in Django

    - by Anshuma
    I am parsing feeds using feedparser and I am trying to store the updated or updated_parsed attributes of the feeds in the Django db. But it shows an error: [u'Enter a valid date/time in YYYY-MM-DD HH:MM[:ss[.uuuuuu]] format.'] Please tell me how to convert updated or updated_parsed so that it can be stored in the Django db, and how to convert the stored date back (or just reuse it) when parsing this way: feedparser.parse("url", modified = lastupdate)
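
    For reference, updated_parsed is a time.struct_time, so it can be turned into a Python datetime (which a Django DateTimeField accepts) and converted back when you want to pass it to feedparser again; a minimal sketch:

        import datetime
        import feedparser

        feed = feedparser.parse("url")
        t = feed.feed.updated_parsed            # a time.struct_time
        updated = datetime.datetime(*t[:6])     # store this in a DateTimeField

        # later: convert the stored datetime back for a conditional fetch
        lastupdate = updated.timetuple()
        feed = feedparser.parse("url", modified=lastupdate)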

  • Testing Django Inline ModelForms: How to arrange POST data?

    - by unclaimedbaggage
    Hi folks, I have a Django 'add business' view which adds a new business with an inline 'business_contact' form. The form works fine, but I'm wondering how to write up the unit test - specifically, the 'postdata' to send to self.client.post(settings.BUSINESS_ADD_URL, postdata) I've inspected the fields in my browser and tried adding post data with corresponding names, but I still get a 'ManagementForm data is missing or has been tampered with' error when run. Anyone know of any resources for figuring out how to post inline data? Relevant models, views & forms below if it helps. Lotsa thanks. MODEL: class Contact(models.Model): """ Contact details for the representatives of each business """ first_name = models.CharField(max_length=200) surname = models.CharField(max_length=200) business = models.ForeignKey('Business') slug = models.SlugField(max_length=150, unique=True, help_text=settings.SLUG_HELPER_TEXT) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) phone = models.CharField(max_length=100, null=True, blank=True) mobile_phone = models.CharField(max_length=100, null=True, blank=True) email = models.EmailField(null=True) deleted = models.BooleanField(default=False) class Meta: db_table='business_contact' def __unicode__(self): return '%s %s' % (self.first_name, self.surname) @models.permalink def get_absolute_url(self): return('business_contact', (), {'contact_slug': self.slug }) class Business(models.Model): """ The business clients who you are selling products/services to """ business = models.CharField(max_length=255, unique=True) slug = models.SlugField(max_length=100, unique=True, help_text=settings.SLUG_HELPER_TEXT) description = models.TextField(null=True, blank=True) primary_contact = models.ForeignKey('Contact', null=True, blank=True, related_name='primary_contact') business_type = models.ForeignKey('BusinessType') deleted = models.BooleanField(default=False) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) address_1 = models.CharField(max_length=255, null=True, blank=True) address_2 = models.CharField(max_length=255, null=True, blank=True) suburb = models.CharField(max_length=255, null=True, blank=True) city = models.CharField(max_length=255, null=True, blank=True) state = models.CharField(max_length=255, null=True, blank=True) country = models.CharField(max_length=255, null=True, blank=True) phone = models.CharField(max_length=40, null=True, blank=True) website = models.URLField(null=True, blank=True) class Meta: db_table = 'business' def __unicode__(self): return self.business def get_absolute_url(self): return '%s%s/' % (settings.BUSINESS_URL, self.slug) VIEWS: class Contact(models.Model): """ Contact details for the representatives of each business """ first_name = models.CharField(max_length=200) surname = models.CharField(max_length=200) business = models.ForeignKey('Business') slug = models.SlugField(max_length=150, unique=True, help_text=settings.SLUG_HELPER_TEXT) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) phone = models.CharField(max_length=100, null=True, blank=True) mobile_phone = models.CharField(max_length=100, null=True, blank=True) email = models.EmailField(null=True) deleted = models.BooleanField(default=False) class Meta: db_table='business_contact' def __unicode__(self): return '%s %s' % (self.first_name, self.surname) @models.permalink def get_absolute_url(self): return('business_contact', (), {'contact_slug': 
self.slug }) class Business(models.Model): """ The business clients who you are selling products/services to """ business = models.CharField(max_length=255, unique=True) slug = models.SlugField(max_length=100, unique=True, help_text=settings.SLUG_HELPER_TEXT) description = models.TextField(null=True, blank=True) primary_contact = models.ForeignKey('Contact', null=True, blank=True, related_name='primary_contact') business_type = models.ForeignKey('BusinessType') deleted = models.BooleanField(default=False) created = models.DateTimeField(auto_now_add=True) updated = models.DateTimeField(auto_now=True) address_1 = models.CharField(max_length=255, null=True, blank=True) address_2 = models.CharField(max_length=255, null=True, blank=True) suburb = models.CharField(max_length=255, null=True, blank=True) city = models.CharField(max_length=255, null=True, blank=True) state = models.CharField(max_length=255, null=True, blank=True) country = models.CharField(max_length=255, null=True, blank=True) phone = models.CharField(max_length=40, null=True, blank=True) website = models.URLField(null=True, blank=True) class Meta: db_table = 'business' def __unicode__(self): return self.business def get_absolute_url(self): return '%s%s/' % (settings.BUSINESS_URL, self.slug) FORMS: class AddBusinessForm(ModelForm): class Meta: model = Business exclude = ['deleted','primary_contact',] class ContactForm(ModelForm): class Meta: model = Contact exclude = ['deleted',] AddBusinessFormSet = inlineformset_factory(Business, Contact, can_delete=False, extra=1, form=AddBusinessForm, )
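
    For reference, the "ManagementForm data is missing" error usually means the TOTAL_FORMS / INITIAL_FORMS fields of the inline formset are absent from postdata. Each inline field is prefixed with the formset prefix, which for an inline formset defaults to the lowercased related model name plus "_set" - presumably "contact_set" here, but that is an assumption: the safest check is the name attributes in the rendered HTML. A sketch of the POST data with hypothetical values:

        # hypothetical field values; the "contact_set" prefix is assumed from the default
        postdata = {
            'business': 'Test Business',
            'slug': 'test-business',
            'business_type': '1',
            'contact_set-TOTAL_FORMS': '1',
            'contact_set-INITIAL_FORMS': '0',
            'contact_set-MAX_NUM_FORMS': '',          # needed on some Django versions
            'contact_set-0-first_name': 'Jane',
            'contact_set-0-surname': 'Doe',
            'contact_set-0-slug': 'jane-doe',
            'contact_set-0-email': 'jane@example.com',
        }
        response = self.client.post(settings.BUSINESS_ADD_URL, postdata)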

  • UIButton Origin and Device Orientation

    - by Ward
    Hey there, I might be crazy, but for some reason this problem is stumping me. I have a view controller that is set to auto-rotate to orientation. Inside I've got two subviews. One is a UIButton. All I want is for my button's origin to stay locked to the bottom-left in portrait and bottom-right in landscape (so it basically stays in the same place). It should also stay in place for portrait upside down and landscape right. Any ideas? Thanks, Howie

  • Winforms: Embedded NumericUpDown control inside ListView

    - by tanthiamhuat
    Say I have a ListView with 4 columns (Description, Price Per Unit, Quantity, Total Price). I would like to make the third column editable and embed a NumericUpDown control in the Quantity column. Is it possible? And when the Quantity is updated via the NumericUpDown control, the Total Price should also be updated based on Total Price = Quantity * Price Per Unit. Is the above achievable? Any code samples would be greatly appreciated.
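
    A plain WinForms ListView has no built-in cell editors, so the usual workaround is to host a NumericUpDown inside the ListView and move it over the Quantity cell when that cell is clicked, then recompute the total when the value changes. A rough sketch (column indexes follow the layout above; details such as hiding the editor when it loses focus are omitted):

        // one-time setup, e.g. in the form constructor:
        //   listView1.Controls.Add(quantityEditor);
        //   quantityEditor.Visible = false;
        //   quantityEditor.ValueChanged += quantityEditor_ValueChanged;
        private NumericUpDown quantityEditor = new NumericUpDown();
        private ListViewItem editedItem;

        private void listView1_MouseClick(object sender, MouseEventArgs e)
        {
            ListViewHitTestInfo hit = listView1.HitTest(e.Location);
            if (hit.Item == null || hit.Item.SubItems.IndexOf(hit.SubItem) != 2) return;

            editedItem = hit.Item;
            quantityEditor.Bounds = hit.SubItem.Bounds;          // cover the Quantity cell
            quantityEditor.Value = decimal.Parse(hit.SubItem.Text);
            quantityEditor.Visible = true;
            quantityEditor.BringToFront();
        }

        private void quantityEditor_ValueChanged(object sender, EventArgs e)
        {
            if (editedItem == null) return;
            decimal price = decimal.Parse(editedItem.SubItems[1].Text);            // Price Per Unit
            editedItem.SubItems[2].Text = quantityEditor.Value.ToString();         // Quantity
            editedItem.SubItems[3].Text = (price * quantityEditor.Value).ToString(); // Total Price
        }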

  • Git for Websites / post-receive / Separation of Test and Production Sites

    - by Walt W
    Hi all, I'm using Git to manage my website's source code and deployment, and currently have the test and live sites running on the same box. Following this resource http://toroid.org/ams/git-website-howto originally, I came up with the following post-receive hook script to differentiate between pushes to my live site and pushes to my test site: while read ref do #echo "Ref updated:" #echo $ref -- would print something like example at top of file result=`echo $ref | gawk -F' ' '{ print $3 }'` if [ $result != "" ]; then echo "Branch found: " echo $result case $result in refs/heads/master ) git --work-tree=c:/temp/BLAH checkout -f master echo "Updated master" ;; refs/heads/testbranch ) git --work-tree=c:/temp/BLAH2 checkout -f testbranch echo "Updated testbranch" ;; * ) echo "No update known for $result" ;; esac fi done echo "Post-receive updates complete" However, I have doubts that this is actually safe :) I'm by no means a Git expert, but I am guessing that Git probably keeps track of the current checked-out branch head, and this approach probably has the potential to confuse it to no end. So a few questions: IS this safe? Would a better approach be to have my base repository be the test site repository (with corresponding working directory), and then have that repository push changes to a new live site repository, which has a corresponding working directory to the live site base? This would also allow me to move the production to a different server and keep the deployment chain intact. Is there something I'm missing? Is there a different, clean way to differentiate between test and production deployments when using Git for managing websites? As an additional note in light of Vi's answer, is there a good way to do this that would handle deletions without mucking with the file system much? Thank you, -Walt PS - The script I came up with for the multiple repos (and am using unless I hear better) is as follows: sitename=`basename \`pwd\`` while read ref do #echo "Ref updated:" #echo $ref -- would print something like example at top of file result=`echo $ref | gawk -F' ' '{ print $3 }'` if [ $result != "" ]; then echo "Branch found: " echo $result case $result in refs/heads/master ) git checkout -q -f master if [ $? -eq 0 ]; then echo "Test Site checked out properly" else echo "Failed to checkout test site!" fi ;; refs/heads/live-site ) git push -q ../Live/$sitename live-site:master if [ $? -eq 0 ]; then echo "Live Site received updates properly" else echo "Failed to push updates to Live Site" fi ;; * ) echo "No update known for $result" ;; esac fi done echo "Post-receive updates complete" And then the repo in ../Live/$sitename (these are "bare" repos with working trees added after init) has the basic post-receive: git checkout -f if [ $? -eq 0 ]; then echo "Live site `basename \`pwd\`` checked out successfully" else echo "Live site failed to checkout" fi

  • NHibernate and mysql timestamp

    - by HeavyWave
    I am trying to do versioning with NHibernate and everything works fine; however, right after the insert NHibernate tries to pull the generated timestamp by executing the following query:

        SELECT profileloc_.Updated as Updated14_
        FROM profile_locale profileloc_
        WHERE profileloc_.id=?p0 and profileloc_.culture=?p1;
        ?p0 = 16, ?p1 = 1033

    Which is totally wrong, as it will pull out all versions starting with the first one. How do I make it add ORDER BY Updated DESC to this query? I am using Fluent NHibernate for mappings.

  • How do I use the subversion revision of a css file to prevent browser cache

    - by Clayton
    StackOverflow implements it like this:

        <link rel="stylesheet" href="http://sstatic.net/so/all.css?v=4542">

    Every time the referenced files change, the href attribute of the link tag is updated in the HTML code, thus supporting caching and updated referenced files. My question: how do you retrieve the Subversion revision of that CSS file to include in the link? Subversion keywords only tell you the revision of the file you are currently in. I'm working with PHP/CodeIgniter + jQuery.
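
    Since the site is PHP/CodeIgniter, one approach (a sketch, assuming the svn command-line client is installed on the web server and the CSS file sits in a working copy; the path is a placeholder) is to ask svn info for the file's last-committed revision and append it to the link. In practice you would cache the result rather than shell out on every request.

        <?php
        function css_revision($path = '/var/www/assets/all.css')
        {
            $info = shell_exec('svn info ' . escapeshellarg($path));
            if ($info && preg_match('/^Last Changed Rev: (\d+)/m', $info, $m)) {
                return $m[1];
            }
            return '0';   // fallback when the file is not under version control
        }
        ?>
        <link rel="stylesheet" href="/assets/all.css?v=<?php echo css_revision(); ?>">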

  • Delete manytomanyfield in Django

    - by Mike
    I have the following models:

        class Database(models.Model):
            user = models.ForeignKey(User)
            name = models.CharField(max_length=100)
            created = models.DateTimeField(auto_now_add=True)
            updated = models.DateTimeField(auto_now=True)

        class DatabaseUser(models.Model):
            user = models.ForeignKey(User)
            name = models.CharField(max_length=100)
            password = models.CharField(max_length=100)
            database = models.ManyToManyField(Database)
            created = models.DateTimeField(auto_now_add=True)
            updated = models.DateTimeField(auto_now=True)

    One DatabaseUser can have many Databases under its control. The issue I have is that if I go to delete a Database, it wants to delete the DatabaseUser as well. Is there a way to stop this from happening easily?
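
    For what it's worth, deleting a Database only removes the join-table rows behind the ManyToManyField (plus anything with a ForeignKey pointing at Database); the DatabaseUser object itself is not deleted, even though the admin's confirmation page lists the relationship rows, which can read as if the user were going away. If the goal is simply to detach a database from a user, removing it from the relation avoids a delete altogether; a small sketch with illustrative lookups:

        user = DatabaseUser.objects.get(name='someuser')   # illustrative lookup
        db = Database.objects.get(name='somedb')

        # detach without deleting either object
        user.database.remove(db)

        # or delete the Database: only the m2m rows referencing it go with it
        db.delete()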

  • Strange behavior in DockPanel

    - by plotnick
    I don't understand this: I have a toolbar with buttons bound to custom commands. I also have an expandable control docked to the left of the window - kind of a NavPanel (DevComponents' NavigationPane, to be exact). Now, every time it's collapsed or expanded, the buttons in the toolbar become disabled and stay like that until the focus changes. Of course, it's simple to change the focus inside the Collapsed and Expanded events, but unfortunately that only works for the first one and ignores the second, and all buttons stay disabled. It seems it has something to do with CommandTarget, which I haven't defined anywhere. Maybe I should? Any ideas?
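
    Assuming this is WPF: RoutedCommands evaluate CanExecute relative to the focused element, so when focus lands inside the navigation pane the toolbar buttons can lose their command target. Pinning CommandTarget to the main content region is one way around it; the element and command names below are placeholders. Alternatively, calling CommandManager.InvalidateRequerySuggested() after the pane finishes collapsing/expanding forces a re-evaluation.

        <!-- placeholder names: MainContent is the element that actually handles the commands -->
        <Button Command="{x:Static local:MyCommands.DoSomething}"
                CommandTarget="{Binding ElementName=MainContent}" />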

  • CSS vertical centering split background image not overlapping

    - by user295292
    Is it possible to split 2 images vertically so that, when resizing the browser, they won't overlap but stay vertically centered? Can the left image stay fixed so the right side of it won't get cut off (overlap)? This is what I have now, but when resizing the browser smaller, it pushes the left image underneath the right one. I'd rather have the images cut off on the outer sides and never overlap each other in the middle - make sense?

        #wrapper {
            width: 1680px;
            max-width: 1680px;
            height: 500px;
            margin: 0 auto;
        }
        #left-image {
            width: 50%;
            position: absolute;
            left: auto;
            height: 500px;
        }
        #right-image {
            width: 50%;
            position: absolute;
            right: 0px;
            height: 500px;
        }

  • updatepanel problem or possible bug

    - by Woland
    I have a textbox and a label in an UpdatePanel which I refresh with JavaScript: __doPostBack(upEditReminder,id); Then I set both the label and the textbox text to the current datetime:

        protected void upReminder_Onload(object sender, EventArgs e)
        {
            lbTest.Text = DateTime.Now.ToString();
            tbReminder.Text = DateTime.Now.ToString();

    The problem is that the label is updated, but the textbox date is updated only once when the page is loaded, and not when __doPostBack(upEditReminder,id); is triggered. I can't figure out what the problem is. Your help is much appreciated.
