Search Results


  • How to properly enable drag-and-drop in Windows 7 (with iTunes in particular)

    - by user38795
    On two computers running Windows 7, one at work and one at home, with the same version of iTunes (10.1.0.56), I can drag-and-drop files onto my iPod at work, but I cannot at home. At home it only works if I first add the files to the Library through a dialog box; only after that does drag-and-dropping onto the iPod (or iPad) work. Initially I thought the problem was in iTunes, but since it is the same version on both machines, it is most likely a Windows 7 configuration issue. Both are 64-bit Windows 7, and I have admin privileges in both cases. The only difference I can think of is that it is an Enterprise edition at work and Ultimate at home. What should I change at home to enable drag-and-dropping files into the application?

    Read the article

  • Personal | Going For A Long Drive

    - by Jeff Julian
    This weekend we were planning on going to Mt. Rushmore, but with the weather the way it is, we decided to head south instead. So what are we going to do? A tour of different restaurants from the show Diners, Drive-ins, and Dives. Not very original, I know, since there are web sites and iPhone apps dedicated to locating the establishments, but it definitely sounds like it could be some fun. We are going to leave KC tonight and go through St. Louis, Memphis, Little Rock, Dallas, Oklahoma City, and back to KC. The kiddos are excited, and we have plenty of movies, coloring books, etc. in the car for the trip. This will be the first time we get to use the turn-around seats in the minivan with our pull-out table. I will have my laptop and phone if anything goes wrong with the site while I am gone, and John will be back in KC as well. I hope to push some photos and reviews of the restaurants as we travel. Related Tags: blogging, Diners, Drive-ins, and Dives, Vacation

    Read the article

  • Why does Dropbox fail with a "dropboxd not found" error?

    - by Kevin
    This is a fresh install of Dropbox on a 12.04 server running on a 32-bit system. The installation does not appear to fail, but when I try to run it, I am told the file can't be found. Following the instructions on the Dropbox site, I get the following message:

        /home/kevin/.dropbox-dist/dropboxd: 10: exec: /home/kevin/.dropbox-dist/dropbox: not found

    Permissions currently in use:

        $ ls -l ~/.dropbox-dist/dropboxd
        -rwxrwxrwx 1 kevin kevin 258 Jun 13 20:40 /home/kevin/.dropbox-dist/dropboxd

    Has anyone had this problem and knows a workaround?

    Read the article

  • Windows 7 Software Installation : Siemens Step7 and Siemens SCL

    - by Shaharyar
    Hello sysadmins, We use Siemens Step7 and SCL for PLC programming in our company. I recently installed Windows 7, only to find out that the Siemens software suite no longer installs, which is a pain... Checking through the setup files, I found the following entries in the Setups.ini file:

        [OS]
        PlattformIDAllowed=1;2
        WinXP=True
        WinXPExclude=0,Home,1
        WinXPWarning=1
        WinNETServer=True
        WinNETServerExclude=Home,0,1
        WinNETServerWarning=Home,0,1
        Win2003=1
        Win2003Exclude=Home,0,1
        Win2003Warning=Home,0,1
        Win2003R2=1
        Win2003R2Exclude=0,1
        Win2003R2Warning=0,1
        WinVista=1
        Win2008=1
        Win7=1

    I added the last line to try things out, and it still gives me the same error. As far as I know the suite is installed by InstallShield. All help would be very appreciated!

    Read the article

  • Visual Studio Shortcut: Surround With

    - by Jeff Widmer
    I learned a new Visual Studio keyboard shortcut today that is really awesome: the “Surround With” shortcut. You can trigger the Surround With context menu by pressing the Ctrl-K, Ctrl-S key combination while on a line of code. Ctrl-K, Ctrl-S means to hold down the Control key, press K, and then, while still holding the Control key, press S. Here is where this comes in handy: you type a line of code and then realize you need to put it within an if statement block. So you type “if” and hit Tab twice to insert the if statement code snippet. Then you highlight the previous line of code that you typed and either drag and drop it into the if-then block or cut and paste it. That is not too bad, but it is a lot of extra keystrokes and mouse moves. Now try the same with the Surround With keyboard shortcut: just highlight the line of code that you just typed, press Ctrl-K, Ctrl-S, choose the if statement code snippet, hit Tab, and POW!... you are done! No more code moving/indenting required. Here is what the Surround With context menu looks like: just arrow up or down inside the drop-down list to the code snippet that you want to surround your currently selected text with. Did I mention this is AWESOME! Now it is so simple to surround lines of code with an if-then block or a try-catch-finally block... things that usually took several keystrokes and maybe one or two mouse moves. And this works in both Visual Studio 2008 and Visual Studio 2010, which means it has been around for a long time and I never knew about it. Technorati Tags: Visual Studio Keyboard Shortcut
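    To make the workflow concrete, here is a minimal before-and-after sketch; the variable names are invented for illustration, not taken from the post:

        using System;

        class SurroundWithDemo
        {
            static void Main()
            {
                string order = "widget";

                // Before the shortcut, the next statement was typed on its own.
                // After highlighting it, pressing Ctrl-K, Ctrl-S, and choosing the
                // "if" snippet, Visual Studio wraps and re-indents it like this:
                if (order != null)
                {
                    Console.WriteLine("Processing {0}", order);
                }
            }
        }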

    Read the article

  • What's consuming HDD space?

    - by Umair Mustafa
    I have a single partition of 92GB in which I installed Ubuntu 12.04, and for some unknown reason a message pops up saying that I only have 1GB of HDD space left. I ran the command sudo du -hscx * on / and /home.

    /home gave me this result:

        4.0K    C:\nppdf32Log\debuglog.txt
        0       convertedvideo.avi
        176M    Desktop
        16K     Documents
        169M    Downloads
        4.0K    examples.desktop
        17M     file.txt
        4.0K    Music
        984K    Pictures
        4.0K    Public
        320K    Red Hat 6.iso
        2.5M    syslog-ng_3.3.6.tar.gz
        4.0K    Templates
        8.0K    terminal.png
        1.2M    Thunderbird Attachments
        698M    ubuntu10.04LTS.iso
        16K     Ubuntu One
        4.0K    Untitled Folder
        4.0K    Videos
        21G     VirtualBox VMs
        22G     total

    And / gave me this result:

        81G     home
        0       initrd.img
        0       initrd.img.old
        833M    lib
        16K     lost+found
        68K     media
        4.0K    mnt
        260M    opt
        du: cannot access `proc/8339/task/8339/fd/4': No such file or directory
        du: cannot access `proc/8339/task/8339/fdinfo/4': No such file or directory
        du: cannot access `proc/8339/fd/4': No such file or directory
        du: cannot access `proc/8339/fdinfo/4': No such file or directory
        0       proc
        640K    root
        908K    run
        8.6M    sbin
        4.0K    selinux
        4.0K    srv
        0       sys
        148K    tmp
        3.3G    usr
        436M    var
        0       vmlinuz
        0       vmlinuz.old
        86G     total

    If you look at the result returned for /, it shows that /home is consuming 81GB, but on the other hand /home itself reports only 22GB. I can't figure out what is consuming the HDD. I have not installed anything except virtual machines.

    Update: the perpetrator was found using Disk Usage Analyzer.

    Read the article

  • The ugly evolution of running a background operation in the context of an ASP.NET app

    - by Jeff
    If you’re one of the two people who has followed my blog for many years, you know that I’ve been going at POP Forums for almost 15 years now. Publishing it as an open source app has been a big help because it helps me understand how people want to use it, and having it translated to six languages is pretty sweet. Despite this warm and fuzzy group hug, there has been an ugly hack hiding in there for years.

    One of the things we find ourselves wanting to do is hide some kind of regular process inside of an ASP.NET application that runs periodically. The motivation for this has always been that a lot of people simply don’t have a choice, because they’re running the app on shared hosting, or don’t otherwise have access to a box that can run some kind of regular background service. In POP Forums, I “solved” this problem years ago by hiding some static timers in an HttpModule. Truthfully, this works well as long as you don’t run multiple instances of the app, which in the cloud world is always a possibility. With the arrival of WebJobs in Azure, I’m going to solve this problem. This post isn’t about that.

    The other little hacky problem that I “solved” was spawning a background thread to queue emails to subscribed users of the forum. This evolved quite a bit over the years, starting with a long-running page to mail users in real time, when I had only a few hundred. By the time it got into the thousands, or tens of thousands, I needed a better way. What I did was launch a new thread that read all of the user data in, then wrote a queued email to the database (as in, the entire body of the email, every time), with the properly formatted opt-out link. It was super inefficient, but it worked. Then I moved my biggest site using it, CoasterBuzz, to an Azure Website, and it stopped working.

    So let’s start with the first stupid thing I was doing. The new thread was simply created with delegate code inline. As best I can tell, Azure Websites are more aggressive about garbage collection, because that thread didn’t queue even one message. When the calling server response went out of scope, so went the magic background thread. Duh, all I had to do was move the thread to a private static variable in the class. That’s the way I was able to keep stuff running from the HttpModule. (And yes, I know this is still prone to failure, particularly if the app recycles. For as infrequently as it’s used, I have not, however, experienced this.)

    It was still failing, but this time I wasn’t sure why. It would queue a few dozen messages, then die. Running in Azure, I had to turn on the application logging and FTP in to see what was going on. That led me to a helper method I was using as a delegate to build the unsubscribe links. The idea here is that I didn’t want yet another config entry to describe the base URL, appended with the right path that would match the routing table. No, I wanted the app to figure it out for you, so I came up with this little thing:

        public static string FullUrlHelper(this Controller controller, string actionName, string controllerName, object routeValues = null)
        {
            var helper = new UrlHelper(controller.Request.RequestContext);
            var requestUrl = controller.Request.Url;
            if (requestUrl == null)
                return String.Empty;
            var url = requestUrl.Scheme + "://";
            url += requestUrl.Host;
            url += (requestUrl.Port != 80 ? ":" + requestUrl.Port : "");
            url += helper.Action(actionName, controllerName, routeValues);
            return url;
        }

    And yes, that should have been done with a string builder. This is useful for sending out the email verification messages, too. As clever as I thought I was with this, I was using a delegate in the admin controller to format these unsubscribe links for tens of thousands of users. I passed that delegate into a service class that did the email work:

        Func<User, string> unsubscribeLinkGenerator =
            user => this.FullUrlHelper("Unsubscribe", AccountController.Name, new { id = user.UserID, key = _profileService.GetUnsubscribeHash(user) });
        _mailingListService.MailUsers(subject, body, htmlBody, unsubscribeLinkGenerator);

    Cool, right? Actually, not so much. If you look back at the helper, this delegate depends on the controller context to learn the routing and format for the URL. As you might have guessed, those things were turning null after a few dozen formatted links, when the original request to the admin controller went away. That this wasn’t already happening on my dedicated server is surprising, but again, I understand why the Azure environment might be eager to reclaim a thread after servicing the request.

    It’s already inefficient that I’m building the entire email for every user, but going back to check the routing table for the right link every time isn’t a win either. I put together a little hack to look up one generic URL, and use that as the basis for a string format. If you’re wondering why I didn’t just use the curly braces up front, it’s because they get URL encoded:

        var baseString = this.FullUrlHelper("Unsubscribe", AccountController.Name, new { id = "--id--", key = "--key--" });
        baseString = baseString.Replace("--id--", "{0}").Replace("--key--", "{1}");
        Func<User, string> unsubscribeLinkGenerator =
            user => String.Format(baseString, user.UserID, _profileService.GetUnsubscribeHash(user));
        _mailingListService.MailUsers(subject, body, htmlBody, unsubscribeLinkGenerator);

    And wouldn’t you know it, the new solution works just fine. It’s still kind of hacky and inefficient, but it will work until this somehow breaks too.
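    As an aside, here is a minimal sketch of the “private static variable” fix described above, with invented names; it illustrates the approach, and is not the actual POP Forums code:

        using System;
        using System.Threading;

        public static class MailerLauncher
        {
            // Rooting the thread in a static field keeps it alive after the
            // request that spawned it completes; an app pool recycle can
            // still kill it, as noted in the post.
            private static Thread _worker;

            public static void Start(Action mailingJob)
            {
                _worker = new Thread(() => mailingJob()) { IsBackground = true };
                _worker.Start();
            }
        }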

    Read the article

  • Creating dynamic breadcrumb in asp.net mvc with mvcsitemap provider

    - by Jalpesh P. Vadgama
    I have done lots of breadcrumb-like things in normal ASP.NET Web Forms, and I was looking for the same for ASP.NET MVC. After searching the internet I found one great NuGet package for the MVC sitemap provider, which can be easily implemented via a sitemap file. So let’s check how it works. I have created a new MVC 3 web application called breadcrumb, and now I am adding a reference to the sitemap provider via a NuGet package. You can find more information about the MVC sitemap provider at the following URL: https://github.com/maartenba/MvcSiteMapProvid

    Once you add the sitemap provider, you will find a Mvc.SiteMap file with the following content:

        <?xml version="1.0" encoding="utf-8" ?>
        <mvcSiteMap xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                    xmlns="http://mvcsitemap.codeplex.com/schemas/MvcSiteMap-File-3.0"
                    xsi:schemaLocation="http://mvcsitemap.codeplex.com/schemas/MvcSiteMap-File-3.0 MvcSiteMapSchema.xsd"
                    enableLocalization="true">
          <mvcSiteMapNode title="Home" controller="Home" action="Index">
            <mvcSiteMapNode title="About" controller="Home" action="About"/>
          </mvcSiteMapNode>
        </mvcSiteMap>

    Now that we have added the sitemap, it’s time to make the breadcrumb dynamic. As we all know, the standard ASP.NET MVC template has action links by default for Home and About, like the following:

        <div id="menucontainer">
            <ul id="menu">
                <li>@Html.ActionLink("Home", "Index", "Home")</li>
                <li>@Html.ActionLink("About", "About", "Home")</li>
            </ul>
        </div>

    Now I want to replace that with our sitemap provider and make it dynamic, so I have added the following code:

        <div id="menucontainer">
            @Html.MvcSiteMap().Menu(true)
        </div>

    That’s it. This is the magic code: @Html.MvcSiteMap().Menu(true) will dynamically create the breadcrumb for you. Now let’s run this in the browser. You can see that it has created the breadcrumb dynamically, without writing any action link code. So here you can see that with the MvcSiteMap provider we don’t have to write any code; we just need to add the menu syntax and it does the rest automatically. That’s it. Hope you liked it. Stay tuned for more; till then, happy programming.
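    A side note, not from the original post: if what you want is a classic breadcrumb trail rather than a full menu, the MvcSiteMap provider also offers a SiteMapPath helper. The snippet below is a hedged sketch of the v3-era syntax; verify the exact helper name against the provider’s documentation:

        <div id="breadcrumbcontainer">
            @Html.MvcSiteMap().SiteMapPath()
        </div>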

    Read the article

  • #AJIReport 16 | Jason Bock on Windows Runtime and Metaprogramming

    - by Jeff Julian
    In this episode we sit down with Jason Bock to talk about Windows Runtime and his upcoming book on metaprogramming. Jason has been a consultant at Magenic for the past 11 years. In this show, Jason walks us through how to get started with Windows RT and talks about what the experience is like deploying to the Windows Store. We get into the new frontier of device development and the restrictions that are in place to protect users and other applications. Towards the end of the show we talk about Jason's book on metaprogramming, which he is co-authoring with Kevin Hazzard. Listen to the Show Site: http://www.jasonbock.net/ Book: Metaprogramming in .NET Twitter: @JasonBock

    Read the article

  • A debugging experience with "highly compatible" ASP.NET 4.5

    - by Jeff
    I have to admit that I will pretty much upgrade software for no reason other than being on the latest version. I won’t do it if it’s super expensive (Adobe gets money from me about once every three or four years at best), but particularly with frameworks and stuff generally available as part of my MSDN subscription, I’ll be bleeding edge. CoasterBuzz was running on the MVC 4 framework pretty much as soon as they did a “go live” license for it. I didn’t really jump in head-first with Windows 8 and Visual Studio 2012, in part because I just wasn’t interested in doing the reinstalls for each new version. Turns out there weren’t that many revisions anyway. But when the final versions were released a week and a half ago, I jumped in.

    I saw on one of the Microsoft sites that .NET 4.5 was a “highly compatible in-place update” to the framework. Good enough for me. I was obviously running it by default in Windows 8, and installed it on my production server. I suppose it’s “highly compatible,” except when it isn’t. Three of my sites are running with various flavors of the MVC version of POP Forums. All of them stopped working under ASP.NET 4.5. It was not immediately obvious what the problem might be, beyond an exception indicating that there were no repository classes registered with Ninject, which I use for dependency injection in the forums. This was made all the more weird by the fact that it ran fine locally in the dev Web host.

    My first instinct was to spin up a Windows Server VM on my local box and put the remote debugger on it. (Side note: running multiple VM’s on a Retina MacBook Pro with 16 gigs of RAM is pretty much the most awesome thing ever. I can’t believe this computer is for real, and not a 50-pound tower under my desk.) What might have been going on in IIS that doesn’t happen in Visual Studio?

    In the debugging process, I realized that I might be looking in the wrong place. POP Forums creates a Ninject container using a method called from a PreApplicationStartMethod attribute, and at that time registers a module (what Ninject uses to map interfaces to implementations) that maps all of the core dependencies. It also creates an instance of an HttpModule that originally hosted the “services” (search indexing, mailer, etc.), but now just records errors. That’s all well and good, but the actual repository mapping, where data is actually read or persisted, happens in Application_Start() in global.asax. The idea there is that you can swap out the SqlSingleWebServer repos for something tuned for multiple servers, Oracle or something else. Of course, if I used something like StructureMap, which does convention-based mapping for dependency injection (a class implementing ISettingsRepository called SettingsRepository is automagically mapped), I wouldn’t have to worry about it.

    In any case, the HttpModule, being instantiated before Application_Start() gets to run, would throw because there was no repo mapped where it could get settings from the database. This makes total sense. The fix is sort of a hack, where I don’t set up the innards of the HttpModule until a call to its BeginRequest is made. I say it’s a hack, because its primary function, logging exceptions, won’t work until the app has warmed up. Still, this brings up an interesting question about the race condition, and what changed in 4.5 when it’s running in IIS.

    In ASP.NET 4, it would appear that the code called via the PreApplicationStartMethod was either failing silently and running again later, or it was getting to that code after Application_Start was called. In any case, weird thing. The real pain point I’m experiencing now is a bug in MVC 4 that is extremely serious, because it renders the mobile/alternate view functionality very much broken.
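    For what it’s worth, here is a minimal sketch of the lazy-initialization fix described above, with invented names; it illustrates the approach, and is not the actual POP Forums module:

        using System;
        using System.Web;

        public class ErrorLoggingModule : IHttpModule
        {
            private static bool _initialized;
            private static readonly object _syncRoot = new object();

            public void Init(HttpApplication context)
            {
                // Defer the real setup until the first BeginRequest, by which
                // time Application_Start() has run and the repository mappings
                // registered there are available.
                context.BeginRequest += (sender, e) =>
                {
                    if (_initialized) return;
                    lock (_syncRoot)
                    {
                        if (_initialized) return;
                        // ...resolve settings/repositories and wire up logging here...
                        _initialized = true;
                    }
                };
            }

            public void Dispose() { }
        }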

    Read the article

  • Aptana Under linux

    - by fatnjazzy
    Hey, I downloaded Aptana Studio 2.0 and unzipped it on the desktop. I'm trying to run Aptana Studio 2.0 under OpenSUSE 11 and I get the following error. Any idea why? Thanks

        JVM terminated. Exit code=-1
        -Xms40m
        -Xmx384m
        -Djava.awt.headless=true
        -XX:MaxPermSize=256m
        -Djava.class.path=/home/avi/Desktop/Aptana Studio 2.0/plugins/org.eclipse.equinox.launcher_1.0.200.v20090520.jar
        -os linux
        -ws gtk
        -arch x86
        -showsplash
        -launcher /home/avi/Desktop/Aptana Studio 2.0/AptanaStudio
        -name AptanaStudio
        --launcher.library /home/avi/Desktop/Aptana Studio 2.0/plugins/org.eclipse.equinox.launcher.gtk.linux.x86_1.0.200.v20090520/eclipse_1206.so
        -startup /home/avi/Desktop/Aptana Studio 2.0/plugins/org.eclipse.equinox.launcher_1.0.200.v20090520.jar
        -application com.aptana.ide.desktop.integration.Application
        -vm /usr/lib/jvm/java-1.6.0-openjdk-1.6.0/jre/bin/../lib/i386/client/libjvm.so
        -vmargs
        -Xms40m
        -Xmx384m
        -Djava.awt.headless=true
        -XX:MaxPermSize=256m
        -Djava.class.path=/home/avi/Desktop/Aptana Studio 2.0/plugins/org.eclipse.equinox.launcher_1.0.200.v20090520.jar

    Read the article

  • Local Events | Azure Bootcamp

    - by Jeff Julian
    Coming to Kansas City April 8th and 9th is the Microsoft Azure Bootcamp. This event looks very promising for those developers who are looking into Azure for themselves or their companies. It covers the wide range of topics required to understand what Azure really is and is not. Space is limited, so if you are considering Azure, register for this event today.

    Agenda:

    Module 1: Introduction to cloud computing and Azure
        How it works
        Key Scenarios
        The development environment and SDK
    Module 2: Using Web Roles
        Basic ASP.NET
        Basic configuration
    Module 3: Blobs: File storage in the cloud
    Module 4: Tables: Scalable hierarchical storage
    Module 5: Queues: Decoupling your systems
    Module 6: Basic Worker Roles
        Executing backend processes
        Consuming a queue
        Leveraging local storage
    Module 7: Advanced Worker Roles
        External Endpoints
        Inter-role communication
    Module 8: Building a business with Azure
        Using Azure as an ISV or a partner
        Advantages to delivering value
        BPOS
        Pricing
    Module 9: SQL Azure
        Setting it up
        SQL Azure firewall
        Remote management
        Migrating data
    Module 10: AppFabric
        Service Bus
        Access Control System
        Identity in the cloud
    Module 11: Cloud Scenarios
        App migration strategies
        Disposable computing
        Dynamic scale
        Shunting
        Prototyping
        Multitenant applications

    (This is my second attempt at this post after MacJournal decided to crash and not save my work. Authoring tools all need auto-save features by now; that is a requirement set in stone by Microsoft Word 97.)

    Related Tags: Azure, Microsoft, Kansas City

    Read the article

  • .NET development on a “Retina” MacBook Pro

    - by Jeff
    The rumor that Apple would release a super high resolution version of its 15” laptop has been around for quite a while, and one I watched closely. After more than three years with a 17” MacBook Pro, and all of the screen real estate it offered, I was ready to replace it with something much lighter. It was a fantastic machine, still doing 6 or 7 hours after 460 charge cycles, but I wanted lighter and faster. With the SSD I put in it, I was able to sell it for $750.

    The appeal of higher resolution goes way back, to when I would plug into a projector and scale up. Consolas, as it turns out, is a nice-looking font for code when it’s bigger. While I have mostly indifference for iOS, I have to admit that a higher dot pitch on the iPhone and iPad is pretty to look at. So I ordered the new 15” “Retina” model as soon as the Apple Store went live with it, and got it seven days later.

    I’ve been primarily using Parallels as my VM of choice from OS X for about five years. They recently put out an update for compatibility with the display, though I’m not entirely sure what that means. I figured there would have to be some messing around to get the VM to look right. The combination that seems to work best is this: set the display in OS X to “more room,” which is roughly the equivalent of the 1920x1200 that my 17” did. It’s not as stunning as the text at the default 1440x900 equivalent (in OS X), but it’s still quite readable. Parallels still doesn’t entirely know what to do with the high resolution, though what it should do is somehow treat it as native. That flaw aside, I set the Windows 7 scaling to 125%, and it generally looks pretty good. It’s not really taking advantage of the display for sharpness, but hopefully that’s something that Parallels will figure out.

    Screen tweaking aside, I got the base model with 16 gigs of RAM, so I give the VM 8. I can boot a Windows 7 VM in 9 seconds. Nine seconds! The Windows Experience Index scores are all 7 and above, except for graphics, which are both at 6. Again, that’s in a VM. It’s hard to believe there’s something so fast in a little slim package like that. Hopefully this one gets me at least three years, like the last one.

    Read the article

  • Ryan Weber On KCNext | #AJIReport

    - by Jeff Julian
    We sit down with Ryan Weber of KCNext in our office to talk about the Kansas City market for technology. The Technology Council of Greater Kansas City is committed to growing the existing base of technology firms, recruiting and attracting technology companies, aggregating and promoting our regional IT assets and providing peer interaction and industry news. During this show we talk about why KCNext is great for Kansas City. They offer some great networking and educational events, but also focus on connecting companies together to help build relationships on a business level. Make sure you visit their website to see what events are coming up and link up with them on Twitter to stay on top of news from the KC technology community. Listen to the Show Site: http://www.kcnext.com/ Twitter: @KCNext LinkedIn: KCNext - The Technology Council of Greater Kansas City

    Read the article

  • AJI Report 14 &ndash; Brian Lagunas on XAML and Windows 8

    - by Jeff Julian
    We sat down with Brian at the Iowa Code Camp to talk about his sessions, WPF, Application Design, and what Infragistics has to offer developers. Infragistics is a huge supporter of regional events like Iowa Code Camp and we want to thank them for their support of the Midwest region. Brian is a sharp guy and it was great to meet him and learn more about what makes him tick. Brian Lagunas is an INETA Community Speaker, co-leader of the Boise .Net Developers User Group (NETDUG), and original author of the Extended WPF Toolkit. He is a multi-recipient of the Microsoft Community Contributor Award and can be found speaking at a variety of user groups and code camps around the nation. Brian currently works at Infragistics as a Product Manager for the award winning NetAdvantage for WPF and Silverlight components. Before geeking out, Brian served his country in the United States Army as an infantryman and later served his local community as a deputy sheriff.   Listen to the Show   Site: http://brianlagunas.com Twitter: @BrianLagunas

    Read the article

  • Wildcards not being substituted

    - by user21463
    Here is the script:

        #!/bin/bash
        loc=`echo ~/.gvfs/*/DCIM/100_FUJI`
        rm -f /mnt/fujifilmA100
        ln -s "$loc" /mnt/fujifilmA100

    For some reason the * wildcard doesn't get substituted with the only possible value, and loc gets the literal value /home/chris/.gvfs/*/DCIM/100_FUJI. Does anyone have an idea why? Please note: if glob expansion fails, the pattern is not substituted. I ran the commands:

        chris@comp2008:~$ loc=`echo ~/.gvfs/*/DCIM/100_FUJI`
        chris@comp2008:~$ echo $loc
        /home/chris/.gvfs/gphoto2 mount on usb%3A001,008/DCIM/100_FUJI

    So we can see the expansion should work. I have now switched to using:

        loc=`find ~/.gvfs -name 100_FUJI`

    I am just curious why it doesn't work as-is. Debugging output using sh -x:

        echo /home/chris/.gvfs/*/DCIM/100_FUJI
        loc=/home/chris/.gvfs/*/DCIM/100_FUJI
        rm -f /mnt/fujifilmA100
        ln -s /home/chris/.gvfs/*/DCIM/100_FUJI /mnt/fujifilmA100

    Read the article

  • AJI Report with Nat Ryan&ndash;Discussion about Game Development with Corona Labs SDK

    - by Jeff Julian
    We sat down with Nat Ryan of Fully Croisened to talk about game development and the Corona Labs framework. The Corona SDK is a platform that allows you to write mobile games or applications using the Lua language and deploy to the iOS and Android platforms. One of the great features of Corona is that the compilation output is a native application, not a hybrid application. Corona is very centered around its developer community, and there are quite a few local meetups focused on helping other developers use the platform. The community and the Corona site offer a great number of resources and samples that will help you get started in a matter of a few days. If you are into game development and want to move towards mobile, or you are a business developer looking to turn your craft back into a hobby, check out this recording and Corona Labs to get started. Download the Podcast Site: AJI Report – @AJISoftware Site: Fully Croisened Twitter: @FullyCroisened Site: Corona Labs

    Read the article

  • Compiling Gnucash 2.6.3 in Ubuntu 14.04

    - by wolveryn
    I downloaded the Debian file from SourceForge and followed the instructions, but the errors below appear; I re-downloaded the file several times with the same error. I want to install the latest GnuCash, not the one available in the Software Center. Thank you for your support.

        /qof/gnc-date/qof print date dmy buff: There are some differences between distros in the way they name locales, and this can cause trouble with the locale-based formatting. If you get the assert in this function, run locale -a and make sure that en_US, en_GB, and fr_FR are installed and that if a suffix is needed it's in the suffixes array.
        ** ERROR:test-gnc-date.c:465:test_gnc_setlocale: code should not be reached
        FAIL
        GTester: last random seed: R02Sd8d3d0e67be954baa8ec75d81a14c0e3
        /bin/bash: line 1: 18889 Terminated MALLOC_CHECK_=2 MALLOC_PERTURB_=$((${RANDOM:-256} % 256)) gtester --verbose test-qof
        make[5]: *** [test-nonrecursive] Error 143
        make[5]: Leaving directory `/home/ahmed/gnucash/gnucash-2.6.3/src/libqof/qof/test'
        make[4]: *** [check-am] Error 2
        make[4]: Leaving directory `/home/ahmed/gnucash/gnucash-2.6.3/src/libqof/qof/test'
        make[3]: *** [check-recursive] Error 1
        make[3]: Leaving directory `/home/ahmed/gnucash/gnucash-2.6.3/src/libqof/qof'
        make[2]: *** [check-recursive] Error 1
        make[2]: Leaving directory `/home/ahmed/gnucash/gnucash-2.6.3/src/libqof'
        make[1]: *** [check-recursive] Error 1
        make[1]: Leaving directory `/home/ahmed/gnucash/gnucash-2.6.3/src'
        make: *** [check-recursive] Error 1

    Read the article

  • Personal | First Stop on our trip, St. Louis

    - by Jeff Julian
    St. Louis is definitely a cool city. I have always looked at it as Kansas City’s big brother. I love the Arch, wonder what it would be like to have pro hockey, really like the downtown area, and have some great friends who live there. The reason we left for St. Louis on Thursday evening was to get a head start on our journey. Since we were doing a Diners, Drive-ins, and Dives tour, it made sense to have the journey start there. We picked the Hyatt Downtown as our hotel because they had an Arch package, which was supposed to get you tickets to the Arch so you didn’t need to arrive early and wait in line. That ended up not working because the Arch had been selling out every day and they were no longer accepting the hotel’s tickets. No biggie, and the hotel did try very hard to get us tickets, but we just took our chances in the line and waited. We walked over to the park, had to wait about 20 minutes for the doors to open, and had tickets after another 20 minutes of waiting in line; at that point we walked right up and were able to get to the elevators.

    I want to stop here for a little aside. I don’t know who started the rumor that the Arch ride is scary, but it is not. You do sit in a small pod, but it is like the ascent on a roller coaster to the top of the first drop, or an elevator with no windows. Nothing to be afraid of here if you aren’t claustrophobic. If you are afraid of small spaces, stay clear of this ride. Once you get to the top, you walk up 10 to 30 stairs depending on which car you were in (the lower the number, the fewer stairs you climb), and you are then at the top, in a decent-sized room where you look out the windows. Beautiful view of the city. I don’t typically like heights, but this felt like being inside a building, not hanging out on a roof. Here is the view from the Arch:

    Related Tags: Diners, Drive-ins, and Dives, St. Louis, Vacation

    Read the article

  • Html.RenderAction Failed when Validation Failed

    - by Shaun
    The RenderAction method was first introduced in the MvcFutures assembly alongside ASP.NET MVC 1.0 and then officially shipped with ASP.NET MVC 2.0. Like RenderPartial, RenderAction can display HTML markup defined in a partial view inside any parent view. But RenderAction gives us the ability to populate the data from an action which may differ from the action populating the main view. For example, in Home/Index.aspx we can invoke Html.RenderPartial("MyPartialView"), but the data of MyPartialView must then be populated by the Index action of the Home controller. If we need MyPartialView to be shown in Product/Create.aspx, we have to copy (or invoke) the relevant code from the Index action of the Home controller in the Create action of the Product controller, which is painful. With Html.RenderAction we can tell ASP.NET MVC from which action/controller the data should be populated: in both the Home/Index.aspx and Product/Create.aspx views we just call Html.RenderAction("CreateMyPartialView", "MyPartialView"), and it will invoke the CreateMyPartialView action of the MyPartialView controller regardless of the main view. But in my current project we found a bug: a RenderAction method I implemented in the master page, to show something that needs to connect to the backend data center, fails when the validation logic fails on some pages. I created a sample application below.

    Demo application

    I created an ASP.NET MVC 2 application, and here I need to display the current date and time on the master page. I created an action in the Home controller named TimeSlot and stored the current date in ViewData. This method was marked HttpGet, as it just retrieves some data instead of changing anything.

        [HttpGet]
        public ActionResult TimeSlot()
        {
            ViewData["timeslot"] = DateTime.Now;
            return View("TimeSlot");
        }

    Next, I created a partial view under the Shared folder to display the date and time string.

        <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<dynamic>" %>
        <span>Now: <%: ViewData["timeslot"].ToString() %></span>

    Then, in the master page, I used Html.RenderAction to display it in front of the logon link.

        <div id="logindisplay">
            <% Html.RenderAction("TimeSlot", "Home"); %>
            <% Html.RenderPartial("LogOnUserControl"); %>
        </div>

    It’s fairly simple and works well when I navigate to any page. But when I moved to the logon page and clicked the LogOn button without entering anything for username and password, the validation failed and my website crashed with the beautiful yellow page. (I really like its color style and fonts…)

    How ASP.NET MVC executes Html.RenderAction

    In this example all the other pages rendered successfully, which means ASP.NET MVC found the TimeSlot action under the Home controller, except in this situation. The only difference is that when I clicked the LogOn button, the browser sent an HttpPost request to the server. Is that the reason for this bug? I created another action in the Home controller with the same action name, but for HttpPost:

        [HttpPost]
        [ActionName("TimeSlot")]
        public ActionResult TimeSlot(object dummy)
        {
            return TimeSlot();
        }

    Or, I can use the AcceptVerbsAttribute on the TimeSlot action to let it allow both HttpGet and HttpPost:

        [AcceptVerbs("GET", "POST")]
        public ActionResult TimeSlot()
        {
            ViewData["timeslot"] = DateTime.Now;
            return View("TimeSlot");
        }

    Then I repeated what I did before, and this time it worked well. Why do we need an action for HttpPost here when it’s just retrieving data? That is because of how ASP.NET MVC executes the RenderAction method. In the source code of ASP.NET MVC we can see that when performing RenderAction, ASP.NET MVC creates a RequestContext instance from the current RequestContext and creates a ChildActionMvcHandler instance, which inherits from the MvcHandler class. Then ASP.NET MVC processes the handler through the HttpContext.Server.Execute method. That means it performs the action as a stand-alone request and flushes the result into the TextWriter being used to render the current page. Since the request was an HttpPost when I clicked LogOn, ASP.NET MVC, while processing the ChildActionMvcHandler, looked for an action that allows the current request method, which is HttpPost, so our HttpGet-only TimeSlot method did not match.

    Summary

    In this post I described a bug in my current project regarding the new Html.RenderAction method provided with ASP.NET MVC 2 when processing an HttpPost request. In the ASP.NET MVC world the underlying HTTP information is more important than in the ASP.NET WebForms world. We need to pay more attention to what kind of request is being made and how ASP.NET MVC processes it.

    Hope this helps, Shaun

    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • VirtualHost and 403 Forbidden problem with Apache

    - by Joaquín L. Robles
    Hi people, I have exactly the same problem as in "403 Forbidden Error when accessing enabled virtual host", but I tried the provided solution and am still receiving 403 Forbidden. My site is called project, and its configuration file in /etc/apache2/sites-available is the following:

        <VirtualHost *:80>
            ServerAdmin webmaster@reweb
            ServerName www.online.project.com
            ErrorLog /logs/project-errors.log
            DocumentRoot /home/joarobles/Zend/workspaces/DefaultWorkspace7/online.project.com.ar/public
            <Directory /home/joarobles/Zend/workspaces/DefaultWorkspace7/online.project.com.ar>
                Order Deny,Allow
                Allow from all
                Options Indexes
            </Directory>

    I tried to change the owner of /home/joarobles/Zend/workspaces/DefaultWorkspace7/online.cobico.com.ar to www-data:www-data with recursion, but I am still getting this error in project-errors.log:

        [Mon Jan 31 01:26:01 2011] [crit] [client 127.0.0.1] (13)Permission denied: /home/joarobles/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

    Any idea? Thanks!

    Read the article

  • Can't get wireless on macbook pro 8,2

    - by Jeff
    I'm a Linux newb, and I have tried several of the fixes listed to try to get my wifi drivers to work, but to no avail. Does anyone here know why this isn't working for me, or better yet, how to fix it? From lspci -vvv I get the following output:

        03:00.0 Network controller: Broadcom Corporation BCM4331 802.11a/b/g/n (rev 02)
            Subsystem: Apple Inc. AirPort Extreme
            Control: I/O- Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx-
            Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast TAbort- SERR-
            Kernel modules: bcma

    With sudo lshw -class network I get this output:

        *-network UNCLAIMED
            description: Network controller
            product: BCM4331 802.11a/b/g/n
            vendor: Broadcom Corporation
            physical id: 0
            bus info: pci@0000:03:00.0
            version: 02
            width: 64 bits
            clock: 33MHz
            capabilities: pm msi pciexpress bus_master cap_list
            configuration: latency=0
            resources: memory:b0600000-b0603fff

    Any help would be greatly appreciated!

    Read the article

  • Site too large to officially use Google Analytics?

    - by Jeff Atwood
    We just got this email from the Google Analytics team:

        We love that you love our product and use it as much as you do. We have observed, however, that a website you are tracking with Google Analytics is sending over 1 million hits per day to Google Analytics servers. This is well above the "5 million pageviews per month per account" limit specified in the Google Analytics Terms of Service. Processing this amount of data multiple times a day takes up valuable resources that enable us to continue to develop the product for all Google Analytics users. As such, starting August 23rd, 2010, the metrics in your reports will be updated once a day, as opposed to multiple times during the course of the day. You will continue to receive all the reports and features in Google Analytics as usual. The only change will be that data for a given day will appear the following day. We trust you understand the reasons for this change.

    I totally respect this decision, and I think it's very generous not to kick us out. But how do we do this the right way: what's the official, blessed Google way to use Google Analytics if you're a "whale" website with lots of hits per day? Or are there other analytics services that would be more appropriate for very large websites?

    Read the article

  • Lessons from rewriting POP Forums for MVC, open source-like

    - by Jeff
    It has been a ton of work, interrupted over the last two years by unemployment, moving, a baby, failing to sell houses and other life events, but it's really exciting to see POP Forums v9 coming together. I'm not even sure when I decided to really commit to it as an open source project, but working on the same team as the CodePlex folks probably had something to do with it. Moving along the roadmap I set for myself, the app is now running on a quasi-production site... we launched MouseZoom last weekend. (That's a post-beta 1 build of the forum. There's also some nifty Silverlight DeepZoom goodness on that site.)

    I have to make a point to illustrate just how important starting over was for me. I started this forum thing for my sites in old ASP more than ten years ago. What a mess that stuff was, including SQL injection vulnerabilities and all kinds of crap. It went to ASP.NET in 2002, but even then, it felt a little too much like script. More than a year later, in 2003, I did an honest to goodness rewrite. If you've been in this business of writing code for any amount of time, you know how much you hate what you wrote a month ago, so just imagine that with seven years in between. The subsequent versions still carried a fair amount of crap, and that's why I had to start over, to make a clean break. Mind you, much of that crap is still running on some of my production sites in a stable manner, but it's a pain in the ass to maintain.

    So with that clean break, there is much that I have learned. These are a few of those lessons, in no particular order...

    Avoid shiny object syndrome
    Over the years, I've embraced new things without bothering to ask myself why. I remember spending the better part of a year trying to adapt this app to use the membership and profile API's in ASP.NET, just because they were there. They didn't solve any known problem. Early on in this version, I dabbled in exotic ORM's, even though I already had the fundamental SQL that I knew worked. I bloated up the client side code with all kinds of jQuery UI and plugins just because, and it got in the way. All the new shiny can be distracting, and I've come to realize that I've allowed it to be a distraction most of my professional life.

    Just query what you need
    I've spent a lot of time over-thinking how to query data. In the SQL world, this means exotic joins, special caches, the read-update-commit loop of ORM's, etc. There are times when you have to remind yourself that you aren't Facebook, you'll never be Facebook, and that databases are in fact intended to serve data. In a lot of projects, back in the day, I used to have these big, rich data objects and pass them all over the place, through various application tiers, when in reality, all I needed was some ID from the entity. I try to be mindful of how many queries hit the database on a given request, but I don't obsess over it. I just get what I need.

    Don't spend too much time worrying about your unit tests
    If you've looked at any of the tests for POP Forums, you might offer an audible WTF. That's OK. There's a whole lot of mocking going on. In some cases, it points out where you're doing too much, and that's good for improving your design. In other cases it shows where your design sucks. But the biggest trap of unit testing is that you worry it should be prettier. That's a waste of time. When you write a test, in many cases before the production code, the important part is that you're testing the right thing. If you have to mock up a bunch of stuff to test the outcome, so be it, but it's not wasted time. You're still doing up the typical arrange-action-assert deal, and you'll be able to read that later if you need to.

    Get back to your HTTP roots
    ASP.NET Webforms did a reasonably decent job at abstracting us away from the stateless nature of the Web. A lot of people criticize it, but I think it all worked pretty well. These days, with MVC, jQuery, REST services, and what not, we've gone back to thinking about the wire. The nuts and bolts passing between our Web browser and server matter. This doesn't make things harder, in my opinion; it makes them easier. There is something incredibly freeing about how we approach development of Web apps now. HTTP is a really simple protocol, and the stuff we push through it, in particular HTML and JSON, is pretty simple too. The debugging points are really easy to trap and trace.

    Premature optimization is premature
    I'll go back to the data thing for a moment. I've been known to look at a particular action or use case and stress about the number of calls that are made to the database. I'm not suggesting that it's a bad thing to keep these in mind, but if you worry about it outside of the context of the actual impact, you're wasting time. For example, I query the database for last read times in a forum separately of the user and the list of forums. The impact on performance barely exists. If I put it under load, exceeding the kind of load I expect, it still barely has an impact. Then consider it only counts for logged-in users. The context of this "inefficient" action is that it doesn't matter. Did I mention I won't be Facebook?

    Solve your own problems first
    This is another trap I've fallen into. I've often thought about what other people might need for some feature or aspect of the app. In other words, I was willing to make design decisions based on non-existent data. How stupid is that? When I decided to truly open source this thing, building for myself first was a stated design goal. This app has to serve the audiences of CoasterBuzz, MouseZoom and other sites first. In this development scenario, you don't have access to mountains of usability studies or user focus groups. You have to start with what you know.

    I'm sure there are other points I could make too. It has been a lot of fun to work on, and I look forward to evolving the UI as time goes on. That's where I hope to see more magic in the future.

    Read the article

  • Solving Euler Project Problem Number 1 with Microsoft Axum

    - by Jeff Ferguson
    Note: The code below applies to version 0.3 of Microsoft Axum. If you are not using this version of Axum, then your code may differ from that shown here.

    I have just solved Problem 1 of Project Euler using Microsoft Axum. The problem statement is as follows: if we list all the natural numbers below 10 that are multiples of 3 or 5, we get 3, 5, 6 and 9. The sum of these multiples is 23. Find the sum of all the multiples of 3 or 5 below 1000.

    My Axum-based solution is as follows:

        namespace EulerProjectProblem1
        {
            // http://projecteuler.net/index.php?section=problems&id=1
            //
            // If we list all the natural numbers below 10 that are multiples of 3 or 5,
            // we get 3, 5, 6 and 9. The sum of these multiples is 23.
            // Find the sum of all the multiples of 3 or 5 below 1000.

            channel SumOfMultiples
            {
                input int Multiple1;
                input int Multiple2;
                input int UpperBound;
                output int Sum;
            }

            agent SumOfMultiplesAgent : channel SumOfMultiples
            {
                public SumOfMultiplesAgent()
                {
                    int Multiple1 = receive(PrimaryChannel::Multiple1);
                    int Multiple2 = receive(PrimaryChannel::Multiple2);
                    int UpperBound = receive(PrimaryChannel::UpperBound);
                    int Sum = 0;
                    for(int Index = 1; Index < UpperBound; Index++)
                    {
                        if((Index % Multiple1 == 0) || (Index % Multiple2 == 0))
                            Sum += Index;
                    }
                    PrimaryChannel::Sum <-- Sum;
                }
            }

            agent MainAgent : channel Microsoft.Axum.Application
            {
                public MainAgent()
                {
                    var SumOfMultiples = SumOfMultiplesAgent.CreateInNewDomain();
                    SumOfMultiples::Multiple1 <-- 3;
                    SumOfMultiples::Multiple2 <-- 5;
                    SumOfMultiples::UpperBound <-- 1000;
                    var Sum = receive(SumOfMultiples::Sum);
                    System.Console.WriteLine(Sum);
                    System.Console.ReadLine();
                    PrimaryChannel::ExitCode <-- 0;
                }
            }
        }

    Let's take a look at the various parts of the code. I begin by setting up a channel called SumOfMultiples that accepts three inputs and one output. The first two inputs represent the two possible multiples, which are three and five in this case. The third input represents the upper bound of the problem scope, which is 1000 in this case. The lone output of the channel represents the sum of all of the matching multiples:

        channel SumOfMultiples
        {
            input int Multiple1;
            input int Multiple2;
            input int UpperBound;
            output int Sum;
        }

    I then set up an agent that uses the channel. The agent, called SumOfMultiplesAgent, receives the three inputs from the channel, stores them in local variables, and runs a for loop that iterates from 1 to the received upper bound. The agent keeps a running sum in a local variable and sends the sum to the output portion of the channel:

        agent SumOfMultiplesAgent : channel SumOfMultiples
        {
            public SumOfMultiplesAgent()
            {
                int Multiple1 = receive(PrimaryChannel::Multiple1);
                int Multiple2 = receive(PrimaryChannel::Multiple2);
                int UpperBound = receive(PrimaryChannel::UpperBound);
                int Sum = 0;
                for(int Index = 1; Index < UpperBound; Index++)
                {
                    if((Index % Multiple1 == 0) || (Index % Multiple2 == 0))
                        Sum += Index;
                }
                PrimaryChannel::Sum <-- Sum;
            }
        }

    The application's main agent, therefore, simply creates a new SumOfMultiplesAgent in a new domain, feeds the channel the inputs that we need, and then receives the Sum from the output portion of the channel:

        agent MainAgent : channel Microsoft.Axum.Application
        {
            public MainAgent()
            {
                var SumOfMultiples = SumOfMultiplesAgent.CreateInNewDomain();
                SumOfMultiples::Multiple1 <-- 3;
                SumOfMultiples::Multiple2 <-- 5;
                SumOfMultiples::UpperBound <-- 1000;
                var Sum = receive(SumOfMultiples::Sum);
                System.Console.WriteLine(Sum);
                System.Console.ReadLine();
                PrimaryChannel::ExitCode <-- 0;
            }
        }

    The result of the calculation (which, by the way, is 233,168) is sent to the console using good ol' Console.WriteLine().
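    For comparison, here is the same computation in plain sequential C#; this sketch is not part of the original post, but it is a handy way to sanity-check the output of the Axum agents:

        using System;
        using System.Linq;

        class EulerProblem1
        {
            static void Main()
            {
                // Sum the natural numbers below 1000 that are divisible by 3 or 5.
                var sum = Enumerable.Range(1, 999)
                                    .Where(i => i % 3 == 0 || i % 5 == 0)
                                    .Sum();
                Console.WriteLine(sum); // prints 233168
            }
        }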

    Read the article
