Search Results

Search found 13577 results on 544 pages for 'the great gonzo'.

Page 265/544

  • CPU clock scales down so computer is unusable after switching to battery

    - by Ryan
    When I am plugged in, my laptop runs great; however, when I unplug it and run on battery power, my CPU clock speed scales down pretty much all the way. I know this is happening by monitoring the clock speed. When plugged in it will usually stay between 1000MHz and 3000MHz, but when I unplug it, it quickly scales down to less than 500MHz, gets as low as 100MHz, and will NEVER scale up at all on battery power. After I plug the power back in, it begins operating normally again in about a minute. I have tried setting the MIN and MAX CPU performance in Power Options to 100% and have tried messing around with cooling settings, which seemed to be a problem with HP laptops. I have a Toshiba Satellite M500-ST6444 running Windows 7. The BIOS is up to date; I have tried two versions of it.
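
    For reference, here is a sketch of applying the same MIN/MAX processor-state change from an elevated command prompt instead of the Power Options GUI. This is only an illustration; it assumes the standard Windows 7 powercfg aliases and the currently active power plan:

        REM Set the processor-state floor and ceiling to 100% on battery (DC) power
        REM for the active plan, then re-apply the plan so the change takes effect.
        powercfg -setdcvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 100
        powercfg -setdcvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 100
        powercfg -setactive SCHEME_CURRENT

    If the clock still pins itself at 100MHz after this, the throttling is probably happening below Windows (firmware or a vendor power utility) rather than in the power plan.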

    Read the article

  • How to cut the line between quality and time?

    - by m3th0dman
    On one hand, I have been taught by various software engineering books ([1] as an example) that my job as a programmer is to make the best possible software: great design, flexibility, easy maintenance, etc. On the other hand, I realize that I actually write software for money and not for entertainment, and although it is very nice to write good code, plan ahead, refactor after writing and so on, I wonder if it is always best for the business (after all, we should be responsible). Does the business always benefit from the best code? Maybe I'm over-engineering something, and it's not always useful? So how do I know when to stop in the process of achieving the best possible code? I am sure that experience makes a difference here, but I believe this cannot be the only answer. [1] Uncle Bob, in Clean Code, says on page 6: They [managers] may defend the schedule and requirements with passion; but that’s their job. It’s your job to defend the code with equal passion.

    Read the article

  • Listing Desktop Apps in the Windows 8 Store

    - by David Paquette
    So it looks like Microsoft will be allowing publishers to list their desktop apps in the Windows 8 Store.  As per the developer agreement: b. Desktop App Submission. You may submit an app description for one or more desktop apps to the Windows Store. Notwithstanding anything else in this agreement, you understand that Microsoft will not offer any desktop apps through the Store and only Windows Store apps are made available through the Windows Store. Microsoft may, but is not required to, list the desktop app in the Windows Store together with a link you provide, to a website where users can acquire the app. You are solely responsible and agree to maintain that website and provide an updated link to Microsoft if the url changes. Desktop apps are any apps built using APIs other than the APIs for Windows Store apps that run on Windows 8. As the agreement states, Microsoft will not distribute desktop apps through the store, but they will provide a link to a website where users can download your desktop app.  If nothing else, it is a great way to advertise your app.  Hopefully this doesn’t cause any confusion with consumers.  Will end users understand why some apps install automatically while others just send you to a website?

    Read the article

  • All files gone after running fsck. How can I recover my files?

    - by cinlung
    I am a newbie in Linux, so this is my story: I installed Ubuntu Server 10.04 LTS. It worked great for many months, until today I decided to run fsck on the system partition, and although it warned me, I kept pressing yes, and now it will only boot into a grub prompt. So I read some articles and tried a grub reinstall. But before performing the grub reinstall, I decided to run fsck again from an Ubuntu 10.04 LTS desktop live CD. The fsck painfully passes; now my drive is recognized as an ext4 filesystem and I am able to mount it again. However, all I can see is just the boot directory and lost+found. I tried to reinstall grub by doing the grub-install stuff, but my grub is still not loading right, my files are missing, and the weird thing is that the space used by boot and lost+found is only 5GB while the space used on the HDD is 8GB. So my files must be somewhere on the HDD. Is there any simple way, maybe a Windows tool or something, to recover my files? I only need to retrieve my database backup and everything else can go. I am freaking out here. Please help.

    Read the article

  • End of Public Updates for Java SE 6

    - by Tori Wieldt
    It's important for developers and systems administrators to either make the transition over to Java SE 7 or to work with Oracle to get updates via the Java SE Support program. Have you updated to Java SE 7? Along with great features (Fork/Join, NIO, Project Coin), Java SE 7 is being updated and patched regularly. Java SE 7 has been out for over a year and is ready to download. The last publicly available release of Oracle JDK 6 is to be released in February 2013. This means that after 19 February 2013, all new security updates, patches and fixes for Java SE 6 and Java SE 5 will only be available through My Oracle Support and will thus require a commercial license with Oracle. In the event you are not ready to migrate to Java SE 7, Oracle offers Java SE Support for continued access to critical bug fixes and security fixes as well as general maintenance for JDK 6. Additionally, Java SE Advanced and Suite offers superior diagnostics and manageability tools that minimize the costs of deployment, monitoring and maintenance of Java-based IT environments. The Java SE Support Roadmap reflects an updated timeline for the End of Public Updates for JDK 6. The End of Public Updates date has been extended from November 2012 to February 2013, to allow some more time for the transition to JDK 7. Older releases of Java SE 6 will still be available in the Java SE archive, but will require a commercial license with Oracle for any new security updates, patches and fixes. The End of Public Updates for Java SE 6 will not impact the usage, availability, or patching of Java SE 6 used for Fusion Middleware 11g and 12c. The support schedule for Java SE used for and in Fusion Middleware is not impacted by this announcement. For more information, visit the Java SE page on Oracle.com.
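
    As a quick illustration of the Project Coin language changes mentioned above, here is a minimal Java SE 7 sketch (the file name is made up) combining the diamond operator and try-with-resources:

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;
        import java.util.ArrayList;
        import java.util.List;

        public class CoinDemo {
            public static void main(String[] args) throws IOException {
                List<String> lines = new ArrayList<>();   // diamond operator: no repeated type argument
                // try-with-resources: the reader is closed automatically, even if an exception is thrown
                try (BufferedReader in = new BufferedReader(new FileReader("notes.txt"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        lines.add(line);
                    }
                }
                System.out.println(lines.size() + " lines read");
            }
        }

    Neither feature compiles on Java SE 6, which is one concrete reason the migration is worth planning rather than postponing indefinitely.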

    Read the article

  • CUPS overwrites AuthInfoRequired

    - by mooscape
    My problem is basically identical to the following: http://bbs.archlinux.org/viewtopic.php?id=61826 Put simply, I have one Ubuntu machine trying to connect over the network to another Ubuntu machine in order to use the printer attached to it. There is no problem printing until I restart the guest machine. Immediately it overwrites the printers.conf file (under /etc/cups/printers.conf). It always adds the same line: AuthInfoRequired username,password I stop cups and change it to #AuthInfoRequired username,password to comment out the directive. Start cups. Works great 'til the next shutdown. Then it gets overwritten again. Googling indicates it may be a GTK problem and not CUPS, but I have found no permanent solution to date. Any suggestions appreciated ....

    Read the article

  • Calling a model from a controller from the 404 route [migrated]

    - by IrishRob
    Got a problem here where I can’t seem to load a method from a model after the page has been redirected after encountering a 404. Model name: Category_Model. Method name: get_category_menu(). In my routes, I’ve updated the 404 over-ride to:

        $route['404_override'] = 'whoops';

    I’ve also got my controller Whoops that reads…

        <?php
        class Whoops extends CI_Controller {

            function index() {
                $this->load->model('Category_Model');
                $data['Categories'] = $this->Category_Model->get_category_menu();
                $data['main_content'] = $this->load->view('messages/whoops', null, true);
                $this->load->view('includes/template', $data);
            }
        }

    So when I navigate to a page that doesn’t exist, I get the following error…

        Message: Undefined property: Whoops::$Category_Model
        Filename: controllers/whoops.php

    I’ve hard coded the loading of the model into the controller here, even though I have it in my autoload, but no luck. Everything else with the site so far works, just this 404 problem. Any pointers would be great, kinda new to CI so go easy on me. Cheers.

    Read the article

  • ShoreTel 230 phone calls from website

    - by Michael Irey
    Our company uses these fancy ShoreTel 230 phones. We make many phone calls from our custom built web based contact management system. It would be nice if our employees could click on a phone number from a webpage and have it automatically start dialing the number. (Similar to how the iPhone handles this.) Anyone ever deploy something like this? I would imagine it would require some kind of background ShoreTel process running to accommodate this. 90% of our employees use PCs (Windows 7), 10% use OS X. Even a PC-only solution would be great. Is this even possible, and if so, where should one begin? Thanks!
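
    For what it's worth, the iPhone behaviour mentioned above comes from plain tel: links, so one hedged starting point (the number below is made up) is to render the numbers in the contact management system as links:

        <!-- illustrative markup; clicking hands the tel: URI to whatever application is registered for it -->
        <a href="tel:+15551234567">+1 555 123 4567</a>

    On a desktop the browser passes the tel: URI to the application registered for that protocol, so a ShoreTel-aware helper (or whatever dialer software ShoreTel provides) would still need to be registered as the tel: handler on each machine.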

    Read the article

  • Set up dual screen with Catalyst Control Center on notebook so that nothing changes when closing lid (case)?

    - by Tom
    I have set up a dual-screen setup for my notebook with an ATI video card (ATI Mobility Radeon HD 5470) using Catalyst Control Center. The monitor extends the desktop of the notebook's screen. This works great. I am running Windows 7 and have set it up to not do anything when I close my laptop's lid (case) other than dim the notebook's display. However, when I do close the lid (notebook's display), the second monitor suddenly changes and I believe it starts behaving as the primary monitor. I do not want this to happen. So, how do I set up my dual-screen setup so that it won't set the secondary monitor as the primary one when I close my laptop's lid? UPDATE: this video shows exactly what my problem is: http://www.youtube.com/watch?v=Q1ygNHbqI2g

    Read the article

  • What virtualization solution should I try next, after hitting problems w/ VMWare Player in a dual-network setup?

    - by Alex R
    I have been using VMWare Player 2.5 for a while (Ubuntu guest on Vista host, 32-bit). VMWare had worked great until now but then I hit a brick wall: Due to some reorganization of my home network, the host machine now has to use a wireless connection to reach the Internet, while the printer, fileserver, and other important stuff are attached to a local gigabit hub. I have tried several tricks, such as editing the .vmx file, changing settings in vmnetcfg, etc, but I'm still unable to get the virtual Ubuntu box to connect each of the two virtual NICs to different networks (I did get it to recognize two NICs, but both DHCP'd onto the gigabit LAN). So, I'm ready to dump vmware for something with a little more low-level control of network settings. Virtualization is such a crowded space, I could spend months evaluating every product out there. I'm hoping for a shortcut... Can anyone recommend the best VM for my situation described above? Thanks

    Read the article

  • Messaging with KnockoutJs

    - by Aligned
    MVVM Light has Messaging that helps keep View Models decoupled and isolated and preserves the separation of concerns, while still allowing them to communicate with each other. This is a very helpful feature: one View Model can send off a message and, if anyone is listening for it, they will react; otherwise nothing happens. I now want to do the same with KnockoutJS View Models. Here are some links on how to do this:

    http://stackoverflow.com/questions/9892124/whats-the-best-way-of-linking-synchronising-view-models-in-knockout
    http://www.knockmeout.net/2012/05/using-ko-native-pubsub.html ~ this is a great article describing the ko.subscribable type.
    http://jsfiddle.net/rniemeyer/z7KgM/ ~ shows how to do the subscription
    https://github.com/rniemeyer/knockout-postbox will be used to help with the PubSub (described in the blog post above) through the NuGet package.
    http://jsfiddle.net/rniemeyer/mg3hj/ ~ an example of knockout-postbox

    Implementation: use syncWith for two-way synchronization.

    MainVM:

        self.selectedElement = ko.observable().syncWith("selectedElement");

    ElementListComponentVM example:

        self.selectedElement = ko.observable().syncWith("selectedElement");
        self.selectedElement.subscribe(function () {
            // do something with the selection change
        });

    ElementVMTwo:

        self.selectedElement = ko.observable().syncWith("selectedElement");

        // subscribe example
        ko.postbox.subscribe("changeMessage", function (newValue) {
            // react to the message
        });

        // or use subscribeTo
        this.visible = ko.observable().subscribeTo("section", function (newValue) {
            // do something here
        });

    · Use ko.toJS to avoid both sides holding the same reference (see the blog post).
    · unsubscribeFrom should be called when the dialog is hidden or closed.
    · Use publishOn to automatically send out messages when an observable changes:

        ko.observable().publishOn("section");
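
    As a self-contained sanity check of the publishOn/subscribeTo pairing above (the topic and view model names are illustrative; knockout and knockout-postbox are assumed to be loaded):

        // The sender publishes every change of its observable on the "selectedElement" topic.
        var ListVM = function () {
            this.selectedElement = ko.observable().publishOn("selectedElement");
        };

        // The receiver stays in sync with whatever is published, without ever referencing ListVM.
        var DetailVM = function () {
            this.selectedElement = ko.observable().subscribeTo("selectedElement");
        };

        var list = new ListVM();
        var detail = new DetailVM();
        list.selectedElement({ id: 1, name: "First" });
        console.log(detail.selectedElement().name); // "First"

    Note that both observables end up holding the same object reference here, which is exactly the ko.toJS caveat mentioned above.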

    Read the article

  • Can certain system-hungry modules be disabled in Ubuntu?

    - by Ole Thomsen Buus
    Hi, let me add some context: I am currently using Ubuntu 9.10 64-bit (Desktop) on a relatively powerful stationary PC (Intel Core i7 920, 12GB RAM). My purpose is high-speed imaging with a Point Grey Grasshopper machine-vision camera (for research, a PhD project). This camera is capable of 200 fps at full VGA (640x480) resolution. The camera is connected using FireWire (1394b) and the drivers and software from Point Grey work great. I have developed a console C++ application that can grab a certain number of frames into preallocated memory and afterwards save the grabbed frames to the hard drive. Currently it works fine, but sometimes I observe a few frame drops (1-3). When this happens I reset the experiment and repeat the recording, and usually I am lucky the second time, with no frame drops (the camera driver has an internal frame counter that I am using). Question: I usually go to tty1 and use "sudo service gdm stop" to disable the graphical frontend. It seems to release some memory, though that is not my main concern. My concern is CPU resources. Are there other system-hungry modules that can be disabled temporarily so that the CPU gets less busy on Ubuntu 9.10? At some point in the future I will update to 10.10. Should I perhaps opt for the server edition instead? Thanks.

    Read the article

  • Reformatting and version control

    - by l0b0
    Code formatting matters. Even indentation matters. And consistency is more important than minor improvements. But projects usually don't have a clear, complete, verifiable and enforced style guide from day 1, and major improvements may arrive any day. Maybe you find that

        SELECT id, name, address FROM persons JOIN addresses ON persons.id = addresses.person_id;

    could be better written as / is better written than

        SELECT persons.id, persons.name, addresses.address FROM persons JOIN addresses ON persons.id = addresses.person_id;

    while working on adding more columns to the query. Maybe this is the most complex of all four queries in your code, or a trivial query among thousands. No matter how difficult the transition, you decide it's worth it. But how do you track code changes across major formatting changes? You could just give up and say "this is the point where we start again", or you could reformat all queries in the entire repository history. If you're using a distributed version control system like Git you can revert to the first commit ever, and reformat your way from there to the current state. But it's a lot of work, and everyone else would have to pause work (or be prepared for the mother of all merges) while it's going on. Is there a better way to change history which gives the best of all results?

    · Same style in all commits
    · Minimal merge work

    To clarify, this is not about best practices when starting the project, but rather what should be done when a large refactoring has been deemed a Good Thing™ but you still want a traceable history? Never rewriting history is great if it's the only way to ensure that your versions always work the same, but what about the developer benefits of a clean rewrite? Especially if you have ways (tests, syntax definitions or an identical binary after compilation) to ensure that the rewritten version works exactly the same way as the original?
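
    One way to carry out the "reformat the entire history" option without replaying it by hand (a sketch only, and it assumes a hypothetical format-sql.sh script that rewrites the queries in place) is git filter-branch:

        # Rewrite every commit on every branch, running the formatter in each checked-out tree.
        # This changes all commit hashes, so it only makes sense if everyone agrees to re-clone
        # or rebase their work onto the rewritten history.
        git filter-branch --tree-filter '/usr/local/bin/format-sql.sh' -- --all

    The follow-up problem the question describes remains: any in-flight branches based on the old hashes still need the mother-of-all-merges treatment, or a rebase onto the new history.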

    Read the article

  • How to disable color dithering for low-bit-depth screen settings?

    - by gogowitsch
    I am using Terminal Services and TeamViewer a lot to access other computers, partly over slow networks. The problem described below is not affected by which of the two remote access services I am using. When accessing Windows 7 Professional machines, a great deal of text is hard to read as the background is dithered. Even for exactly the same colors, Windows 2003 does not seem to dither at all, but to choose the closest available color. I strongly prefer the latter, as I don't care for the exact colors, I just want to be able to read easily. I am not sure whether this is operating system-related. The programs on the remote systems do not allow me to change the color choices for the various backgrounds to anything sane. Is there a way to disable this color dithering using some target operating system setting that will do the trick for both Terminal Services and TeamViewer?

    Read the article

  • Need hard disk recommendation for Linux home server

    - by neotracker
    Hello, I'm planning to build a little Linux home server. It will mainly be used for storage and maybe as a media PC. I plan to build a software RAID5 with four 1.5TB or 2TB hard drives. I had already decided to use the Western Digital Caviar Green 1.5TB drive, but then I read about some problems with the WD Green series (many drives failing) and that they are not recommended for RAID anyway. Of course, I couldn't find many facts on the issue, so I thought I'd just ask here ;-) What hard drives would you recommend for a software RAID5 setup? As I only need it for storage, the whole thing doesn't have to be too fast, so I prefer a cheap price and silence to great performance.

    Read the article

  • Sweden Windows Azure Group Meeting in November & Fast with Windows Azure Competition

    - by Alan Smith
    SWAG November Meeting There will be a Sweden Windows Azure Group (SWAG) meeting in Stockholm on Monday 19th November. Chris Klug will be presenting a session on Windows Azure Mobile Services, and I will be presenting a session on Web Site Authentication with Social Identity Providers. Active Solution have been kind enough to host the event, and will be providing food and refreshments. The registration link is here: http://swag14.eventbrite.com If you would like to join SWAG, the link is here: http://swagmembership.eventbrite.com Fast with Windows Azure Competition I’ve entered a 3 minute video of rendering a 3D animation using 256 Windows Azure worker roles in the “Fast with Windows Azure” competition. This is the last week of voting, so it would be great if you could check out the video and vote for it if you like it. I have not driven a car for about 15 years, so if I win you can expect a hilarious summary of the track day in Vegas. My preparation for the day would be to play Project Gotham Racing for a weekend, and watch a lot of Top Gear. My video is “Rapid Massive On-Demand Scalability Makes Me Fast!”. The link is here: http://www.meetwindowsazure.com/fast/

    Read the article

  • Why does plymouth start so late?

    - by Marky
    It appears that starting with 11.04, Plymouth starts very late in the boot process. Sometimes I only have a split second to see it before it transitions to the login screen. This is the same for 11.10. By comparison, in 10.04 and 10.10 Plymouth starts only a couple of seconds after GRUB and is very visible throughout the boot process. Is there something that can be done to have Plymouth run earlier? I have experienced this on 3 different machines, and on 2 of these machines I've been running Ubuntu since 10.04, so it's not just my notebook's hardware that is causing this. *On a side note, the boot process is one of the ugliest parts of modern Linux, and Ubuntu is not excluded. After almost a decade (I forget, but was bootsplash the first?) this still has only been partly solved. For a couple of seconds ugly text is still seen when shutting down. On several occasions, the same ugly text is seen when logging out of a session. It's never as smooth as you want it to be. Splash themes are great, don't get me wrong; it's just the transitions that are way off, and you get glimpses of what's underneath. I'm used to this, but for those new to Ubuntu and coming from Windows, it is a turn-off.* pardon the rant. :)

    Read the article

  • So my employer wants me to do less programming and focus on IT support

    - by Rich
    I was hired into a non tech company's IT department as a programmer a few years back, and after several rounds of lay offs, we're down to a skeleton crew. I've saved the company hundreds of thousands of dollars with my projects and management has been happy with them (although most of the stakeholders have since left the company). Management now wants me to limit the programming that I do and spend most of my time on IT support: putting out fires, dealing with vendors, outsourced contractors, supporting company systems, managing projects, etc. I am a little burnt out on programming since I've been pushed pretty hard for the past several years. However, I'm not sure if this is a good career move in the long run. I'm a decent programmer (and also good with databases) but not obsessed with it to the point of coding outside of work. I'm approaching my mid 30s and there's potential ageism to deal with down the line. While I'm fortunate to have survived the lay offs, it sorta feels like my job is being "dumbed down". I have both good technical skills and people skills...but it doesn't take a genius to do what I'm doing now. And my success is being increasingly linked to others' performance rather than my own... Just looking for some advice. Is it time to move on? That's not really an easy thing to do since I'd likely have to move to another area to find another comparable tech job. Should I go after another pure technical role? Or should I stay and try to make this work? People say do what you "enjoy" but it doesn't really matter to me as long as I'm getting paid. Also the ageism thing is on the horizon and could be an issue eventually. I'm making a decent (but not great) salary. Should I chase money and maximize my income while I still have a chance? Or be happy with a moderate salary and 40 hour work week?

    Read the article

  • How To Export/Import a Website in IIS 7.x

    - by Tray Harrison
    IIS 6 had a great feature called ‘Save Configuration to a File’ which would allow you to easily export a website’s configuration, to be later used to import either on the same server or another box.  This came in handy anytime you wanted to duplicate a site in order to do some testing without impacting the existing application.  So naturally, Microsoft decided to do away with this feature in IIS 7. The process to export/import a site is still fairly simple, though not as obvious as it was in previous versions.  Here are the steps: 1. Open a command prompt, navigate to C:\Windows\System32\inetsrv and run the following command:

        appcmd list site /name:<sitename> /config /xml > C:\output.xml

    So if you wanted to export a website named EAC, you would run:

        appcmd list site /name:EAC /config /xml > C:\output.xml

    If you’ll be setting up another copy of the site on the same server, you’ll now need to edit the output.xml file before importing it.  This is necessary in order to avoid conflicts such as bindings, Site ID, etc.  To do this, edit the XML and change the values.  Go ahead and make a copy of the home directory, and rename it to whatever folder name you specified in the output – /EAC2 in this example.  If you decide to change the app pool, make sure you go ahead and create the new app pool as well. Once these edits have been made, we are now ready to import the site.  To do that run:

        appcmd add sites /in < C:\output.xml

    That’s it.  You should now see your site listed when opening up Inet Manager.  If for some reason the site fails to start, that’s probably because you forgot to create the new app pool or there is a problem with one of the other parameters you changed.  Look at the System log to identify any issues like this.

    Read the article

  • Bash: Quotes getting stripped when a command is passed as argument to a function

    - by Shoaibi
    I am trying to implement a dry-run kind of mechanism for my script and am facing the issue of quotes getting stripped when a command is passed as an argument to a function, resulting in unexpected behavior.

        dry_run () {
            echo "$@"
            #printf '%q ' "$@"
            if [ "$DRY_RUN" ]; then
                return 0
            fi
            "$@"
        }

        email_admin() {
            echo " Emailing admin"
            dry_run su - $target_username -c "cd $GIT_WORK_TREE && git log -1 -p|mail -s '$mail_subject' $admin_email"
            echo " Emailed"
        }

    Output is:

        su - webuser1 -c cd /home/webuser1/public_html && git log -1 -p|mail -s 'Git deployment on webuser1' [email protected]

    Expected:

        su - webuser1 -c "cd /home/webuser1/public_html && git log -1 -p|mail -s 'Git deployment on webuser1' [email protected]"

    With printf enabled instead of echo:

        su - webuser1 -c cd\ /home/webuser1/public_html\ \&\&\ git\ log\ -1\ -p\|mail\ -s\ \'Git\ deployment\ on\ webuser1\'\ [email protected]

    Result:

        su: invalid option -- 1

    That shouldn't be the case if quotes remained where they were inserted. I have also tried using "eval", not much difference. If I remove the dry_run call in email_admin and then run the script, it works great.
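
    As a side note on what that echo is actually showing (a minimal illustration, not part of the script above): the double quotes are removed by the shell while parsing the call site, but the quoted string still arrives as one intact argument, which a loop over "$@" makes visible:

        show_args() {
            # print each argument on its own line, wrapped in <> so the boundaries are obvious
            for arg in "$@"; do
                printf '<%s>\n' "$arg"
            done
        }
        show_args su - webuser1 -c "cd /tmp && ls -l | wc -l"
        # <su>
        # <->
        # <webuser1>
        # <-c>
        # <cd /tmp && ls -l | wc -l>

    So the quoting is only lost when the collected arguments are flattened back into a single string (or re-parsed), which is where a dry-run display like the echo above becomes misleading.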

    Read the article

  • How do I change the NGINX user?

    - by danielfaraday
    I have a PHP script that creates a directory and outputs an image to the directory. This was working just fine under Apache but we recently decided to switch to NGINX to make more use of our limited RAM. I'm using the PHP mkdir() command to create the directory: mkdir(dirname($path['image']['server']), 0755, true); After the switch to NGINX, I'm getting the following warning: Warning: mkdir(): Permission denied in ... I've already checked all the permissions of the parent directories, so I've determined that I probably need to change the NGINX or PHP-FPM 'user' but I'm not sure how to do that (I never had to specify user permissions for APACHE). I can't seem to find much information on this. Any help would be great! (Note: Besides this little hang-up, the switch to NGINX has been pretty seamless; I'm using it for the first time and it literally only took about 10 minutes to get up and running with NGINX. Now I'm just ironing out the kinks.)
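
    For context on where that user is actually configured (a sketch only, with Debian/Ubuntu-style paths and www-data as an illustrative account): the nginx worker user is set in nginx.conf, but PHP code runs under PHP-FPM, so it is the FPM pool user that calls mkdir() and needs write access to the parent directory:

        # /etc/nginx/nginx.conf
        user www-data;

        # /etc/php5/fpm/pool.d/www.conf  (the PHP-FPM pool that actually executes the script)
        user = www-data
        group = www-data

    After changing either file, the corresponding service needs a restart (something like service nginx restart and service php5-fpm restart), and the target directory's ownership or mode has to allow that account to create subdirectories.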

    Read the article

  • Is it worth moving from Microsoft tech to Linux, NodeJS & other open source frameworks to save money for a start-up?

    - by dormisher
    I am currently getting involved in a startup, I am the only developer involved at the moment, and the other guys are leaving all the tech decisions up to me at the moment. For my day job I work at a software house that uses Microsoft tech on a day to day basis, we utilise .NET, SqlServer, Windows Server etc. However, I realise that as a startup we need to keep costs down, and after having a brief look at the cost of hosting for Windows I was shocked to see some of the prices for a dedicated server. The cheapest I found was £100 a month. Also if the business needs to scale in the future and we end up needing multiple servers, we could end up shelling out £10's of £000's a year in SQL Server / Windows Server licenses etc. I then had a quick look at the price of Linux hosting for a dedicated server and saw the price was waaaaaay lower than windows hosting. One place was offering a machine with 2 cores for less than £20 a month. This got me thinking maybe the way to go is open source on Linux. As I write a lot of Javascript at work (I'm working on a single page backbone app at the moment), I thought maybe NodeJS and a web framework like Express would be cool to use. I then thought that instead of using SQL why not use an open source NoSQL database like MongoDB, which has great support on NodeJS? My only concern is that some of the work the application is going to do is going to be dynamically building images and various other image related stuff, i.e. stuff that is quite CPU heavy - so I'm thinking of maybe writing anything CPU heavy in C++ and consuming it as a module in Node. That's the background - but basically is Linux a good match for: Hosting a NodeJS/Express site? Compiling C++ node modules? Using a NoSQL DB like MongoDB? And is it a good idea to move to these unfamiliar technologies to save money?

    Read the article

  • Is there a NVIDIA driver (for a 7-series card) that will actually work for 12.10?

    - by DS13
    I see many similar topics on this, but I've tried all their suggestions, and nothing has worked. ISSUE: I do a clean install of Ubuntu 12.10. It boots fine with the “nouveau” graphics driver – graphics are very slow and choppy. The three other driver options in Ubuntu (official NVIDIA drivers) all result in a variation of the black screen on boot-up. There will be NO access to a command line/GUI in any way whatsoever (I've tried every option recommended out there, but the system is unusable at this stage). I can only reinstall and try different drivers…and I only ever get one shot at it. QUESTIONS: Does anyone know of an NVIDIA driver that will actually work with an NVIDIA GeForce 7350LE? Or a 7-series card in general? This is my second computer, and I'm just trying to get a working install of Ubuntu on it. I don't want to put much money into it, as I have seen Ubuntu run great on much older/less capable machines. I've got a decent Intel processor (2.3GHz), 2GB of RAM, a 320GB hard drive, 32-bit architecture, and there is no other OS installed. It appears as if the graphics card is holding me back. Should I just buy a cheap graphics card (non-NVIDIA) to put in as a replacement? TRIED SO FAR:

    - all drivers available in Ubuntu (all fail)
    - manual install of some different NVIDIA drivers (all fail)
    - installing the generic kernel (no difference; the NVIDIA driver still doesn't work in 12.10)
    - every method suggested to at least get a command line after switching to an NVIDIA driver (all fail)

    Read the article

  • Windows Azure v1.7 Spring Release Today – New Management Dashboard

    - by ToStringTheory
    Today, Microsoft is publicly releasing a new version of Azure.  The web conference, at http://www.meetwindowsazure.com, will be airing at 1 PM PST.  They have already released an update to the Service Dashboard, which can be accessed at http://manage.windowsazure.com.  I have some images of the new dashboard here that I have gathered and removed any PII from.  Let me know what you think! Images: you should be able to click any of the images for a full resolution image. Tutorial: the first thing you get after signing in is the tutorial. Landing: after the tutorial completes, you get a screen with services that are active on your account on the left, and a list of ALL services (db/blob/SQL Azure) on the right.  I like the quick access to services across any of my subscriptions. Service Information: these are images from a running web site with several roles.  I love how easy they have made many of the features. SQL Azure: they have given some great quick functionality for looking at your DB information. Storage: here is the basic information that they give you for any storage accounts you have. Adding Services: super quick and easy to add services with the new UI. Conclusion: I am EXCITED!  As you may have seen in the left side of my blog, I am an MCPD in Azure Development, and I must say that I am excited to see Microsoft moving forward with the technology and not letting it stagnate.  After as much as I have fought the previous Azure dashboard, I like the friendliness and fluidity of this one. The important thing to note about ALL of the images above: this is HTML, not Silverlight.  The responsiveness is FAST on all of the actions I completed, and I believe that this is a big step forward for Azure… So, what do you think?

    Read the article

  • Contractor - Mispaid again - Walked out

    - by MeshMan
    Hi all, I'm wondering whether or not I've done the right thing as a contractor. Basically I'm into my 3rd month, and my current client messed up the payment in the first month, and I just found out that they are again late in paying me for my 2nd month. It wouldn't be so bad if I wasn't in a bit of a financial situation due to this being my first contract experience. But as a matter of principle, I walked out on them and will be telling them that I will not be going back in until they resolve the pay. Part of me feels as though this was not a very professional thing to do, but I also don't feel that it was very professional for them to mess my payments up, twice in a row. Did I make the right decision? I still want to work for these guys and enjoy the job, but I have a life to attend to that requires finances, and I can't afford to keep getting messed up with pay like this. I attempted to phrase the question to be oriented around contractors' behaviour towards clients that mistreat them, and judging by some of the answers posted so far, it's a good discussion. The answers coming in cover the different subjective situations well.

    Read the article
