Search Results

Search found 3615 results on 145 pages for 'cron daily'.

Page 61/145 | < Previous Page | 57 58 59 60 61 62 63 64 65 66 67 68  | Next Page >

  • Update to kernel 3.12 seems to fail: uname reports old rc7

    - by carlo
    I currently run Xubuntu 13.10 with kernel 3.12 rc7. Today I tried updating to the latest 3.12 kernel (non-rc), but this seems to fail. When installing the image and headers I see the following error passing by: ... run-parts: executing /etc/kernel/postinst.d/dkms 3.12.0-031200-generic /boot/vmlinuz-3.12.0-031200-generic ... Error! The dkms.conf for this module includes a BUILD_EXCLUSIVE directive which does not match this kernel/arch. This indicates that it should not be built. ... After rebooting, when I do uname -r or cat /proc/version it tells me that I'm still running on the old rc7 kernel. Since my microphone wasn't working on my Sony Vaio Pro 13, I downloaded and installed the latest ALSA drivers using the oem-audio-hda-daily-dkms package, which seemed to fix the problem (with the mic). Maybe this has something to do with it? I also tried removing the package using sudo apt-get purge oem-audio-hda-daily-dkms, but with no success.
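    A hedged first step (the commands below are standard Ubuntu tooling, but the package and kernel version names are taken from the question and may differ on your machine) is to check what actually got installed and, if the image is missing or half-configured, purge the blocking dkms package and reinstall the kernel:

      uname -r                          # kernel currently booted
      dkms status                       # shows which dkms modules (e.g. the oem-audio one) refuse to build
      dpkg -l | grep linux-image-3.12   # confirms whether 3.12.0-031200-generic was actually installed
      sudo apt-get purge oem-audio-hda-daily-dkms
      sudo apt-get install --reinstall linux-image-3.12.0-031200-generic linux-headers-3.12.0-031200-generic
      sudo update-grub                  # make sure the new kernel ends up in the boot menu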

    Read the article

  • Is the development of CLI apps considered "backward"?

    - by user61852
    I am a fledgling DBA with a lot of experience in programming. I have developed several CLI, non-interactive apps that solve some daily repetitive tasks or eliminate the human error from more complex albeit not so daily tasks. These tools are now part of our toolbox. I find CLI apps are great because you can include them in an automated workflow. Also the Unix philosophy of doing a single thing but doing it well, and letting the output of a process be the input of another, is a great way of building a set of tools that would consolidate into a strategic advantage. My boss recently commented that developing CLI tools is "backward", or constitutes a "regression". I told him I disagreed, because most CLI tools that exist now are not legacy but are live projects with improved versions being released all the time. Is this kind of development considered "backwards" in the market? Does it look bad on a résumé? I also consider that all solutions, whether they are web or desktop, should have command-line, non-interactive options. Some people consider this a waste of programming resources. Is this goal a worthy one in a software project?
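    As a minimal illustration of that pipeline-style composition (the log file name is made up for the example), summarising the most frequent error sources in a log is just a chain of single-purpose tools:

      # each tool does one job; the pipe turns them into a small report
      grep "ERROR" app.log | cut -d' ' -f1 | sort | uniq -c | sort -rn | head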

    Read the article

  • Automatic TRIM vs. manual TRIM

    - by Eike Cochu
    I am currently trying to find out how to trim with my new TP and was wondering about the difference between manual and online trimming. Here is my setup: ThinkPad T430s with SSD Samsung 830, 128GB and Xubuntu 12.10, here are some outputs to check if trim will work on my system (got these from here: http://wiki.ubuntuusers.de/SSD/TRIM) root@eike-tp:~# sudo hdparm -I /dev/sda | grep -i TRIM * Data Set Management TRIM supported (limit 8 blocks) First, I tried the online trimming: How to enable TRIM? my fstab with discard inserted: UUID=d6c49c17-a4f1-466c-9f7e-896c20db3bba / ext4 discard,noatime,errors=remount-ro 0 1 # swap was on /dev/sda5 during installation UUID=a0322f5f-c6c1-4896-863f-668f0638d8cf none swap sw 0 0 tmpfs /tmp tmpfs defaults,noatime,mode=1777 0 0 I tried to test if it works (but I don't get any zeroes when I try it with /dev/sda), but found out that this method is only possible with SSD type 2 and I seem to have type 3. So I don't know if it works or not. The Ubuntu wiki (first link) recommends manual trimming, so I set up a daily cron job instead of discard: #!/bin/sh LOG=/var/log/batched_discard.log echo "*** $(date -R) ***" >> $LOG fstrim -v / >> $LOG the wiki article suggests weekly or daily. Now to my questions: How often does the automated trim execute? How often is recommended? Online vs. manual trimming? Thank you for your help
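    If you go the manual route, a minimal sketch for wiring that script into cron (assuming you save it as /usr/local/sbin/batched_discard) is to drop it into /etc/cron.weekly, or add a root crontab line for a daily run; weekly is generally considered enough for desktop workloads:

      sudo install -m 755 batched_discard /usr/local/sbin/batched_discard
      # weekly: run-parts skips names containing dots, so keep the filename extension-free
      sudo cp /usr/local/sbin/batched_discard /etc/cron.weekly/batched_discard
      # or daily at 03:00 via root's crontab:
      # 0 3 * * * /usr/local/sbin/batched_discard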

    Read the article

  • Antenna Aligner Part 8: It's Alive!!!

    - by Chris George
    Finally the day has come: Antenna Aligner v1.0.1 has been uploaded to the App Store and... "Waiting for review"... fast forward 7 days and much checking of emails later: WOO HOO! Now what? So I set my Facebook page to go live https://www.facebook.com/AntennaAligner, and started by sending messages to my mates that have iPhones! Amazingly a few of them bought it! Similarly some of my colleagues were also kind enough to support me and downloaded it too! Unfortunately the only way I knew they had bought it was from them telling me, as the iTunes Connect data is only updated daily at about midday GMT. This is a shame, surely they could provide more granular updates throughout the day? Although I suppose once an app has been out in the wild for a while, daily updates are enough. It would, however, be nice to get a ping when you make your first sale! I would have expected more feedback on my Facebook page as well, maybe I'm just expecting too much, or perhaps I've configured the page wrong. The new Facebook timeline layout is just confusing, and I'm not sure it's all public, I'll check that! So please take a look and see what you think! I would love to get some more feedback/reviews/suggestions... Oh and watch out for the Android version coming soon!

    Read the article

  • I have done everything correctly on my asp.net website, re: SEO; why aren't Google backlinks showing?

    - by Jason Weber
    I recently implemented many SEO techniques for a company on their asp.net website; in 6 months, we jumped from a PR1 to a PR3. But I'm having issues with Google backlinking. Here are some of the things I've done: Not only did I set up their own Google+ page 6 months ago, I update it pretty much daily with links, pictures, etc., and I blog about it on my own personal Google+ page and post links, etc. ... They have their own Twitter, Facebook, YouTube, and all are updated almost daily. I listed the site in as many quality, relevant directories as possible 6 months ago; I've avoided link farms. The site is solid SEO-wise. Key-phrase-rich URLs, schema.org & rich snippets. No duplicate content ... www or non-www 301's, trailing slashes, etc. ... all taken care of. Probably a ton of other things, but basically, the site is all set, SEO-wise. Here's what's confounding: When I do a link:www.example.com in Bing/Yahoo, it shows many backlinks. When I do a link:www.example.com in Google, it shows 0 links. Or when I use a site ranker like Web Site Rank Tool it's showing 0 backlinks from Google. Any suggestions would be appreciated!

    Read the article

  • What relationship do Scrum or Lean in software have to industrial engineering concepts like the theory of constraints?

    - by DeveloperDon
    In Scrum, work is delivered to customers through a series of sprints in which project work is time-boxed to a fixed number of days or weeks, usually 30 days. In lean software development, the goal is to deliver as soon as possible, permitting early feedback for the next iteration. Both techniques stress the importance of workflow in which software work product does not accumulate in development awaiting release at some future date. Both permit new or refined requirements and feedback from QA and customers to be acted on with as little delay as possible based on priority. A few years ago I heard a lecture where the speaker talked briefly about a family of concepts from industrial engineering called the theory of constraints. In the factory, they use an operations model based on three components: drum, buffer, and rope. The drum synchronizes work product as it flows through the system. Buffers protect the system by holding output from one stage as it waits to be consumed by the next. The rope pulls product from one work station to the next. Historically, are these ideas part of the heritage of Scrum and Lean, or are they on a separate track? If we wanted to think about Scrum and Lean in terms of drum-buffer-rope, what are the parts? Drum = {daily scrum meeting, monthly release}? Buffer = {burn-down list, source control system}? Rope = {daily meeting, continuous integration server, monthly releases}? Industrial engineers define workflow in terms of different kinds of factories. I-Factories: straight pipeline. One input, one output. A-Factories: many inputs and one output. V-Factories: one input, many output products. T-Plants: many inputs, many outputs. If it applies, what kind of factory is most like Scrum or Lean and why?

    Read the article

  • Fresh install on SSD with Ubuntu and Windows Vista, using whole disk encryption for Ubuntu

    - by nategator
    I would like to do a fresh install on an OCZ Vertex Plus R2 SSD 60GB drive I purchased on the cheap. Since the AES encryption looks like it may not work optimally on this drive, I would like to set up a dual boot of Windows Vista (the only Windows copy I have for clean install purposes) and Ubuntu 12.04 with the best encryption scheme possible. My plan is to have Windows around just in case I need to use a program that won't work with Wine, and Ubuntu as my daily OS with all of my information secured in case the laptop is ever stolen or sold. Although this setup will not provide a lot of space, I think I can squeeze both OSes in and have enough for second-computer office tasks. So, my questions are: Which OS should I install first, Ubuntu or Vista? Any special considerations when partitioning the drive? How should I install Ubuntu to ensure full disk encryption for the Linux partition(s) and/or my daily computing? Is there a significant performance upgrade from doing a solo install of Ubuntu instead of a dual-boot setup? Will TRIM, for example, work correctly? Are there any significant security concerns with going the route of a dual boot, other than the fact that any activity on Windows may be fully recoverable if the drive is stolen or sold? Thanks in advance!
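    For the encryption question, a hedged sketch: the alternate installer's LUKS/LVM full-disk setup coexists with a Windows partition, but TRIM does not pass through dm-crypt by default. The mapping name below is an assumption (it varies per install), and note that enabling discard on an encrypted volume does leak some usage patterns:

      sudo cryptsetup status sda5_crypt        # inspect the existing mapping
      # add 'discard' to the options column of that mapping in /etc/crypttab, e.g.:
      #   sda5_crypt UUID=...  none  luks,discard
      sudo update-initramfs -u                 # rebuild the initramfs so the option takes effect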

    Read the article

  • Disaster Recovery Example

    Previously, I used to work for a small internet company that sells dental plans online. Our primary focus concerning disaster prevention and recovery was on our corporate website and private intranet site. We had a multiphase disaster recovery plan that included data redundancy, load balancing, and off-site monitoring. Data redundancy is a key aspect of our disaster recovery plan. The first phase of this is to replicate our data to multiple database servers and schedule daily backups of the databases that are stored off site. The next phase is the file replication of data amongst our web servers, which are also backed up daily by our colocation provider. In addition to the files located on the server, files are also stored locally on development machines, and again backed up using version control software. Load balancing is another key aspect of our disaster recovery plan. Load balancing offers many benefits for our system: better performance, load distribution and increased availability. With our servers behind a load balancer, our system has the ability to accept multiple requests simultaneously because the load is split between multiple servers. Plus, if one server is slow or experiencing a failure, the traffic is diverted amongst the other servers connected to the load balancer, allowing the affected server to get back online. The final key to our disaster recovery plan is off-site monitoring that notifies all IT staff of any outages or errors on the main website encountered by the monitor. Messages are sent by email, voicemail, and SMS. According to Disasterrecovery.org, disaster recovery planning is the way companies successfully manage crises with minimal cost and effort and maximum speed, compared to others that are forced to make decisions out of desperation when disasters occur. In addition, Sun Guard stated in 2009 that the first step in disaster recovery planning is to analyze company risks and factor in fixed costs for things like hardware, software, staffing and utilities, as well as indirect costs, such as floor space, power protection, physical and information security, and management. Also, availability requirements need to be determined per application and system, as well as the strategies for recovery.
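    As a concrete illustration of the "daily backups stored off site" phase (purely a sketch, with made-up hostnames and paths rather than the company's actual setup), a nightly cron script might look like:

      #!/bin/sh
      STAMP=$(date +%F)
      # dump all databases in a consistent snapshot and compress
      mysqldump --single-transaction --all-databases | gzip > /backups/db-$STAMP.sql.gz
      # ship the archive to the off-site host (hypothetical)
      rsync -a /backups/db-$STAMP.sql.gz backup@offsite.example.com:/srv/backups/
      # scheduled from cron, e.g.: 30 2 * * * /usr/local/sbin/daily_db_backup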

    Read the article

  • Reduce weight in a healthy way

    - by krnites
    This post is my daily summary of the activities I am going to take to reduce my overall weight by 15 lbs. I am not an overweight person; for my height of 5'11" I have a decent weight of 178.4 lbs and a good body build. But for a long time (approx. 3 months) I have been thinking of getting 2-3 abs (note: not 6 abs) and reducing my weight to below 170 (which I was 2 years ago). This post will work as my daily diary of what I ate, what exercises I did, and what apps/software I used to monitor my weight loss. Sometimes it will not contain much information, but when I do some research it will contain information for people who really want to reduce weight. My current target for the next few weeks is to run 2.5 miles every day under 30 minutes. Eat a lot of fruits and vegetables. No burgers or tacos. No meat, either fat or lean. And checking body weight every four hours. Here are the readings for today: 10:00 AM - 178.2; 2:00 PM - 178.4. I have already run 2 miles in 25 minutes, did a little bit of shoulder exercise, and have eaten a small bowl of vegetable biryani and two breads. I hope this post and all coming posts will give the reader the first-hand experience of a person who has read every blog and article related to reducing weight and is now trying to do it by implementing all of them himself. My approach will be a healthy way of reducing weight; I am not going to starve myself or eat only fruit all day. I will enjoy all the food items that I like to eat and will work hard on my body. This way I will reduce the weight naturally and will increase the flexibility, durability and immunity of my system/body.

    Read the article

  • Vaadin Calendar events not shown if overnight [migrated]

    - by B_B
    In my Vaadin project there is the possibility to create events that are shown by the calendar. It does work, except when the event is overnight, let's say the night from the 23rd to the 24th, and the calendar shows only the 24th as its day. In this case the part of the event that belongs to the 24th is supposed to be shown, but it is not. When I switch to weekly view, the event is shown properly. Here is the function where I get the data and use a container for the calendar: /* Fill Calendar from database */ void updateData() { final BeanItemContainer<TypeReservationEvent> container = new BeanItemContainer<TypeReservationEvent>(TypeReservationEvent.class); Map<String, Object> parameters = new HashMap<String, Object>(); parameters.put("roomParent",chosenRoom); String query = "SELECT DISTINCT res FROM EntityReservation res, EntityRoom r, EntityTable rt WHERE res.tableId = rt.id " + "AND rt.roomParent =:roomParent"; reservationList = facade.list(query, parameters); for(EntityReservation rt : reservationList) { container.addBean(new TypeReservationEvent(rt)); } container.sort(new Object[]{"start"}, new boolean[]{true}); cal.setContainerDataSource(container, "caption", "description", "start", "end", "styleName"); // Force calendar to refresh if(selectCalViewType.getValue() == chooseWeeklyView) { setViewType(calViewType.DAILY); setViewType(calViewType.WEEKLY); } else if (selectCalViewType.getValue() == chooseDailyView) { setViewType(calViewType.WEEKLY); setViewType(calViewType.DAILY); } } TIA

    Read the article

  • Photoshop alternative with same keyboard shortcuts

    - by user292254
    tldr; Is there a program with the same keyboard shortcuts and/or user interface as PS? I know the "photoshop alternative" question has been asked so many times, but I feel this variant is worth the time to ask. I am a web designer and have been using PS daily on a Windows or Mac machine for about 10 years, but I do most of my development in Linux. The only thing that has been keeping me from the full switch to Linux is PS. I have tried many alternatives, including GIMP, but I feel the barrier to entry is too high for one reason and one reason only: the keyboard shortcuts. Over the years I have become very adept and quick in the use of PS because of keyboard shortcuts and muscle memory. I realize that many shortcuts can be customized, but many actions are just too different from PS. So my question is: what programs are out there that are closest to PS in keyboard shortcuts and interface? I don't care if I have to do a bunch of customization to get it as close as possible, but I just can't take too much time to learn a completely new workflow. Of course I realize that any software will take time and effort; I just want to make that time/effort as minimal as possible. Thanks in advance for your responses. Edit: I should tell you that the most important features and workflow I use on a daily basis are Smart Objects, linked documents, and Save for Web & Devices.

    Read the article

  • Twitter Tuesday - Top 10 @ArchBeat Tweets - June 3-9, 2014

    - by OTN ArchBeat
    The Top 10 tweets from @OTNArchBeat for the last seven days. RT @DBAKevlar: #EM12c rel4 is out! Woohoo!! Jun 3, 2014 at 10:36 AM Top 10 Arch Community Articles for May 2014 >> props to @markrittman @kevin_mcginley @porushh et al Jun 4, 2014 at 12:52 PM Architecture of Analytics: @markrittman @kevin_mcginley >> Free OTN Virtual Tech Summit - July 9 Jun 4, 2014 at 09:13 AM My Top 10 Tweets - May 27 - June 2 #ADF #Essbase #FusionApps #Goldengate #Kscope14 #WebLogic. Jun 3, 2014 at 10:27 AM Starting and Stopping a #JavaEE Environment when using Oracle #WebLogic | Rene van Wijk #oracleace Jun 5, 2014 at 11:00 AM Video: #KScope14 Preview: @DebraLilley never stops moving, never stops learning. Jun 3, 2014 at 11:19 AM The OTNArchBeat Daily is out! Stories via @oraclebase Jun 9, 2014 at 01:47 PM Where did my MDB concurrency go? | Eric Gross #weblogic Jun 9, 2014 at 08:48 AM Exalogic Tech tips and code samples from A-Team architect Andrew Hopkinson Jun 6, 2014 at 11:47 AM The OTNArchBeat Daily is out! Stories via @KentGraziano @DBAKevlar @dbasolved Jun 3, 2014 at 01:48 PM adf, essbase,

    Read the article

  • Is file permission secured when it is transferred from Ubuntu to Windows?

    - by Gaurav_Java
    I have a 9GB text file which is encrypted. This file contains some confidential data. It is on my system (Ubuntu) and my external HDD (NTFS). This file gets updated daily and then encrypted, but it has to be shared among 2-3 (Windows) people. I set permissions so that no other person can even read this file (chmod 660). It is too large a file to upload anywhere, and it gets updated on a daily basis. But this file travels between Windows and Ubuntu, and I also have a copy of it on my personal computer. Recently it was deleted by some other user on Windows. I just want to know how I can set permissions on that file so that it cannot be deleted from any other operating system. If someone deletes this file, I am left with data that is a couple of days old, which is only on my system. I went through this question and it says there is nothing, and from this question I am not able to understand how I can protect it. Can I do anything to prevent this file from being deleted? Any suggestions, software, or ideas? Maybe I sound silly or this is a stupid question. Please don't close it; thanks for any suggestion or solution.
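    Permissions set on Ubuntu are not enforced once the file sits on an NTFS external drive or a Windows machine, so there is no chmod-style setting that survives the trip. As a partial mitigation for the copy that stays on the Ubuntu system (the path is a placeholder and assumes an ext4 filesystem), the immutable attribute at least blocks accidental deletion:

      sudo chattr +i /data/confidential.txt    # immutable: even root must clear the flag before deleting
      lsattr /data/confidential.txt            # verify the 'i' attribute is set
      sudo chattr -i /data/confidential.txt    # clear it for the daily update, then set it again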

    Read the article

  • Iterative and Incremental Principle Series 5: Conclusion

    - by llowitz
    Thank you for joining me in the final segment in the Iterative and Incremental series. During yesterday's segment, I discussed Iteration Planning, and specifically how I planned my daily exercise (iteration) each morning by assessing multiple factors, while following my overall implementation plan. As I mentioned in yesterday's blog, regardless of the type of exercise or how many increment sets I decide to complete each day, I apply the 6-minute interval sets and a timebox approach. When the 6 minutes are up, I stop the interval, even if I have more to give, saving the extra energy to apply to my next interval set. Timeboxes are used to manage iterations. Once the pre-determined iteration duration is reached, whether it is 2 weeks or 6 weeks or somewhere in between, the iteration is complete. Iteration group items (requirements) not fully addressed, in relation to the iteration goal, are addressed in the next iteration. This approach helps eliminate the "rolling deadline" and better allows the project manager to assess the project progress earlier and more frequently than in traditional approaches. Not only do smaller, more frequent milestones allow project managers to better assess potential schedule risks and slips, but process improvement is encouraged. Even in my simple example, I learned, after a few interval sets, not to sprint uphill! Now I plan my route more efficiently to ensure that I sprint on a level surface to reduce the risk of not completing my increment. Project managers have often told me that they used an iterative and incremental approach long before OUM. An effective project manager naturally organizes project work consistent with this principle, but a key benefit of OUM is that it formalizes this approach so it happens by design rather than by chance. I hope this series has encouraged you to think about additional ways you can incorporate the iterative and incremental principle into your daily and project life. I further hope that you will share your thoughts and experiences with the rest of us.

    Read the article

  • How do I increase timeout for a cronjob/crontab?

    - by Mohit Ranka
    I have written a script that gets data from Solr for dates within a specified period, and I run the script as a daily cron job. The problem is the cron job does not complete the task. If I manually run the script (for the same time period), it works well. If I reduce the specified time period, the script runs fine from cron as well. So my guess is the cron job is timing out while running the script when there is too much data to process. How do I increase the timeout for a cron job? PS: The script I am running in the cron job is a bash script which runs a Python script.
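    For what it's worth, cron itself does not impose a timeout on jobs, so the script is more likely being killed by the host or failing for another reason; a hedged first step (paths are placeholders) is to capture the job's output and prevent overlapping runs before assuming a timeout:

      # crontab entry: serialise runs with flock and keep a full log of stdout/stderr
      0 1 * * * /usr/bin/flock -n /tmp/solr_export.lock /home/me/solr_export.sh >> /home/me/solr_export.log 2>&1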

    Read the article

  • Magento: Rebuilding Flat Catalog Programmatically

    - by karnage
    I am using a cron job to import inventory changes nightly. When I try to change a product's information (price, etc.) I get the following error: Column not found: 1054 Unknown column 'e.display_price_group_0' in 'field list' I can fix this by clicking "Rebuild Flat Catalog Product" in the Cache Management panel. I set up a cron job to do this programmatically using the following code: Mage::getResourceModel('catalog/product_flat_indexer')->rebuild(); I don't get any errors when I run the script, but the "Column not found" error persists. Does anyone know how I can rebuild the flat catalog other than through the admin interface?
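    A hedged alternative, assuming a stock Magento 1.x tree at /var/www/magento: the bundled shell indexer can rebuild the flat product index from the same cron that runs the import, without touching the admin interface:

      php /var/www/magento/shell/indexer.php info                            # list the indexer codes on this install
      php /var/www/magento/shell/indexer.php --reindex catalog_product_flat  # rebuild just the flat product catalog
      # e.g. nightly, right after the import job:
      # 30 2 * * * php /var/www/magento/shell/indexer.php --reindex catalog_product_flat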

    Read the article

  • PHP/Cronjob: What is my cron job's problem?

    - by Marius
    Hello there, I have no idea what's going on, but I have a script that looks like this and cron refuses to run it: include_once 'class_lib/mime_mail/mimeDecode.php'; include_once 'class_lib/Mail/IMAPv2.php'; include_once 'inc-functions.php'; include_once "$_SERVER[DOCUMENT_ROOT]/class_lib/DbFuctioneer.php"; $dbFuctioneer = new DbFuctioneer(); Everything works well when I remove: $dbFuctioneer = new DbFuctioneer(); Even when DbFuctioneer() looks like this: <?php class DbFuctioneer { function dbCountMatches( $count) { return $count; } } Does cron have a problem with classes in its jobs? Thank you for your time. Kind regards, Marius
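    One guess, offered as an assumption rather than a diagnosis: under cron, PHP runs via the CLI, where $_SERVER['DOCUMENT_ROOT'] is empty, so the include path to DbFuctioneer.php resolves to /class_lib/... and the class never gets loaded, which only bites on the line that instantiates it. A crontab sketch that sidesteps this (paths are placeholders):

      # change into the web root first so DOCUMENT_ROOT-based and relative includes resolve
      */10 * * * * cd /var/www/mysite && /usr/bin/php cron_script.php >> /var/log/cron_script.log 2>&1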

    Read the article

  • notification scheduling question

    - by sims
    Hi Stackers! I'm building an app that needs to send out notifications to users depending on user-definable "notifications". So the notifications are not per event; they are arbitrary. A cron job should query the database and send out emails when it finds an event with matching criteria. This is a scheduling app, so naturally one of the criteria is time. I think I'll limit the user's ability to get notified to "minutes beforehand", as opposed to seconds or hours. So I have an eventA and notificationA. notificationA should be triggered if an event is due within 45 minutes. So eventA starts at 17:30. The user should be notified at 16:45. But the cron job might not run exactly at 00 seconds. So when the difference is not 45 minutes and 0 seconds it is, say, 45 minutes and 5 seconds. The notification time is past, the email doesn't get sent, the user misses the event. Shugar. We should also take into account that the cron job might take a long time to run. So maybe we should only trigger it every 5 minutes. So then we need a bigger interval maybe. So my guess would be to say that: if ((eventDueTime - now > notificationTimeValue - interval) && (eventDueTime - now < notificationTimeValue + interval)) sendTheFrikinNotificationAlready(); It seems kind of risky if there are thousands of notifications to send out. I guess I could make a thread for each notification and then a thread for each event that matches the criteria. That might help. Does that make sense? Any other ideas? Thanks!

    Read the article

  • Can I import SQL with PHP?

    - by shin
    Please bear with me (beginner). I am planning to set up an application where visitors can log in and play/mess around with my application, so I want to refresh my PHP application once a day with a cron job. But I have never written a cron job script before. I understand that I have to truncate all the tables/data and add the initial data. With phpMyAdmin, I can import data. Is there any way I can import an SQL file after truncating the tables? What is the best way? Do I have to create the tables and insert all the data? Help and resources will be appreciated. Thanks in advance.
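    A minimal sketch of that nightly reset, assuming the clean starting state has been exported once (with DROP TABLE statements included) to a dump file, and with placeholder credentials: the dump itself recreates the tables, so a single mysql command from cron is enough and no PHP is needed.

      # re-import the seed dump; its DROP/CREATE statements replace the existing tables and data
      mysql -u app_user -p'secret' app_db < /home/me/seed.sql
      # crontab entry, every night at 04:00:
      # 0 4 * * * mysql -u app_user -p'secret' app_db < /home/me/seed.sql >> /var/log/reset.log 2>&1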

    Read the article

  • How do I automatically rebuild the Sphinx index under django-sphinx?

    - by Apreche
    I just set up django-sphinx, and it is working beautifully. I am now able to search my model and get amazing results. The one problem is that I have to build the index by hand using the indexer command. That means every time I add new content, I have to manually hit the command line to rebuild the search index. That is just not acceptable. I could make a cron job that automatically runs the indexer command every so often, but that's far from optimal. New data won't be indexed until the cron job runs again. In addition, the indexer will run unnecessarily most of the time, as my site doesn't have data being added very often. How do I set it up so that the Sphinx index will automatically rebuild itself whenever data is added to or modified in a searchable Django model?
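    The fully automatic route would be to call the same command from a Django post_save signal; short of that, the usual compromise (still the cron approach the question wants to avoid, named plainly) is Sphinx's indexer with --rotate, which rebuilds and hot-swaps the index without stopping searchd. The config path below is the Debian default and an assumption:

      indexer --all --rotate --config /etc/sphinxsearch/sphinx.conf
      # e.g. every 15 minutes from crontab:
      # */15 * * * * indexer --all --rotate --config /etc/sphinxsearch/sphinx.conf >> /var/log/sphinx_reindex.log 2>&1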

    Read the article

  • Cocoa - How to copy files to /usr/share?

    - by cyaconi
    Hi all. I'm developing an "installation"-like Cocoa application which needs to take care of some HTTP requests, some file system reading, copying files to /usr/share, setting up cron (not launchd) and asking the user for some information. I discarded PackageMaker since I need more flexibility. Currently everything is going well, but on my last installation step, I need to: Delete my previously installed application folder (if it exists). It's always the same path: /usr/share/MY_APP Create again the application folder at: /usr/share/MY_APP Copy application files to /usr/share/MY_APP Update a cron job It's very important that /usr/share/MY_APP stays protected with administrative privileges, so a regular user shouldn't be able to delete it. What would be the best approach to implement those steps? BTW, I'm using Xcode 3.2. Thanks a lot! Carlos.
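    For reference, the shell equivalent of the four steps looks like the sketch below (the payload path and cron line are assumptions); inside the Cocoa app these would need to run with elevated privileges, for example through a privileged helper, so that /usr/share/MY_APP stays root-owned:

      sudo rm -rf /usr/share/MY_APP                # 1. delete any previous install
      sudo mkdir -p /usr/share/MY_APP              # 2. recreate the folder, root-owned
      sudo cp -R ./payload/ /usr/share/MY_APP/     # 3. copy the application files
      # 4. append the new job line to root's crontab
      ( sudo crontab -l 2>/dev/null; echo "0 * * * * /usr/share/MY_APP/update.sh" ) | sudo crontab -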

    Read the article

  • How to call another PHP script from a PHP script?

    - by Jagira
    Hello, I have a PHP script that has a runtime of 34 seconds, but it dies after 30 seconds. I guess my web host has a time limit of 30 seconds. I am thinking of splitting the script into two parts, say PHP-1 and PHP-2. Can I call PHP-2 from PHP-1 and kill PHP-1? Both scripts have to run in sequence, so calling both of them using cron is not possible. [My host provides cron with a 5-minute interval and does not allow changing the start time.] Will this circumvent the time limit set by the host?
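    If the 30-second wall is PHP's max_execution_time or the web server's request timeout (rather than a hard process limit the host enforces), a common workaround is to have PHP-1 launch PHP-2 as a detached CLI process, where that limit does not apply; a hedged sketch of the command PHP-1 would pass to exec() (the path is a placeholder):

      # run PHP-2 detached from the web request, with no execution-time cap, logging to a file
      nohup php -d max_execution_time=0 /home/me/php-2.php > /tmp/php-2.log 2>&1 &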

    Read the article

< Previous Page | 57 58 59 60 61 62 63 64 65 66 67 68  | Next Page >