Search Results

Search found 24450 results on 978 pages for 'microsoft updates'.

Page 22/978 | < Previous Page | 18 19 20 21 22 23 24 25 26 27 28 29  | Next Page >

  • Why can't I install Microsoft Office 2007 in Ubuntu 11.04?

    - by DK new
    I am very new to Ubuntu and only just getting the hang of it, so my questions might sound stupid, especially because I am a learner in terms of techie things as well. Because of the nature of my work, where everyone uses stupid Windows and Microsoft, I need access to MS Office 2007/2010, as documents with many tables or images open all haywire in LibreOffice (which has otherwise been great!). I have been reading up about installing MS Office through WINE/PlayOnLinux, but have been unsuccessful so far. I downloaded an MS Office 2007 package from Pirate Bay, which I extracted into a folder. I tried numerous different ways to install through WINE and PlayOnLinux, but will discuss the one which seems to be getting me somewhere: http://www.webupd8.org/2011/01/how-to-install-microsoft-office-2007-in.html Initially, when I click on the install button of MS Office, I get a message saying "The install location you selected does not have 1558MB free space. Free up space from the selected install location or choose a different install location". The install location in this case said "C:\Program Files\Microsoft Office", which confused me as I don't have drives named C, Z, etc. I went to configure WINE and, under the Drives tab, created a drive named A with the path location /media/cd025f16-433b-4a90-abb6-bb7a025d0450/. The space thing is also confusing, as I have at least 450GB of unused space on my computer. Anyway, when I select the A drive for installation, the installation starts, but soon I get the following error message: "Office cannot find Office.en-us\OfficeLR.Cab. Browse to a valid installation source". The part after "Office" (here "OfficeLR.Cab") has said something different every time I have made an attempt. When I select the Office.en-us sub-folder, or any other folder within the folder where MS Office 2007 is saved, it says "invalid source"! I have been trying to get this sorted for 15 hours now (addictive!) and have learnt loads of things in the process, but have not managed to crack it. It might be something stupidly simple that I am not aware of that is stopping it. I would really appreciate some help! Thanks a lot. Also, I am still getting used to the language, so I might have many questions. I am using Ubuntu 11.04 (tag 11.04). Also, I think I don't have Windows -- when my friend installed Ubuntu on my new laptop, which had Windows 7, he was trying to keep Windows in a separate partition, but something happened and Windows was not there! Looking forward to some support! Again, thanks a lot.
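    For reference, a minimal sketch of the usual WINE route follows, assuming the extracted installer lives in ~/Downloads/Office2007 (a hypothetical path) and that a separate 32-bit prefix is used; package names and paths may differ slightly on 11.04, so treat this as a starting point rather than a confirmed fix:
      sudo apt-get install wine winetricks          # make sure WINE and winetricks are present
      export WINEARCH=win32                         # Office 2007 is happier in a 32-bit prefix
      export WINEPREFIX="$HOME/.wine-office2007"    # keep it separate from the default prefix
      wine wineboot                                 # initialise the new prefix
      cd ~/Downloads/Office2007                     # run setup from inside the extracted folder,
      wine setup.exe                                # so the Office.en-us\*.cab files sit next to setup.exe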

    Read the article

  • No user/password boxes on login screen after updates today (June 30, 2012)

    - by Tony
    Just installed today's batch of updates, including the new kernel 3.2.0-26, and rebooted. Now the screen just has a logo in the middle and "Ubuntu 12.04 LTS" in the bottom-left corner, but no box to choose which user to log in as, or to enter a password. CTRL-ALT-F1 gets me to a "login:" prompt, and I can log in - but I have no idea what to look at to find out what is wrong, or how to fix this. Tried older kernels and recovery mode for the current kernel - no joy, still no way to log in to the graphical console.
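    Since the text console still works, a hedged first pass is to poke at the display manager and the X log from that VT; the commands below are a generic LightDM/Xorg troubleshooting sketch for 12.04, not a confirmed fix for this particular update:
      # from the CTRL-ALT-F1 console, after logging in
      sudo service lightdm restart                  # try restarting the display manager
      sudo dpkg-reconfigure lightdm                 # re-select lightdm as the default display manager
      grep "(EE)" /var/log/Xorg.0.log               # look for X server errors (often a driver/kernel mismatch)
      tail -n 50 ~/.xsession-errors                 # session-level errors from the last login attempt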

    Read the article

  • Updates to Silverlight Code Browser

    A couple of bits of news. I did a number of updates to the HackingSilverlightCodeBrowser here: http://www.hackingsilverlight.net/HackingSilverlightCodeBrowser.html including things like MEF and IsolatedStorage. As for the community edition of the book, I'm still trying to get some contributors to finish. I've been slammed with 60-hour weeks, so chapters 2 and 4 and Appendix A still need edits; maybe a week's worth of work, pending free time, which tends to be limited in my life. :)

    Read the article

  • Oneiric updates cause "Window Creation Error" in Second Life client

    - by Yuttadhammo
    With the latest Oneiric updates, I've been unable to use Second Life, a virtual reality simulator, in Ubuntu. I assume it has something to do with the NVidia driver, since the error in the console says: WARNING: createWindow: LLWindowManager::create() : Error creating window. WARNING: LLViewerWindow: Failed to create window, to be shutting Down, be sure your graphics driver is updated I guess I could file a bug on Launchpad, but I'm not sure where to file it. Does anyone have any insight into what may be causing this problem? It was working fine a week ago, and older viewers (Imprudence viewer anyway) still work fine.
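    A hedged way to confirm whether the NVIDIA driver is the culprit on Oneiric is to check which driver is actually in use and reinstall the packaged one; the package name below (nvidia-current) is the usual 11.10 one and is an assumption, not taken from the question:
      lspci -k | grep -A3 -i vga                      # shows the GPU and the kernel driver in use
      sudo apt-get install mesa-utils                 # provides glxinfo, if not already installed
      glxinfo | grep -i "opengl renderer"             # software rendering here usually means a broken driver
      sudo apt-get install --reinstall nvidia-current # reinstall the packaged NVIDIA driver
      sudo reboot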

    Read the article

  • 12.04 gnome shell broken after updates

    - by nat
    I'm running Ubuntu 12.04 and using GNOME Shell. I've had my machine running for the last few days, and I've been installing updates as the update manager bugs me. I just rebooted, and now GNOME won't start. I can use GNOME Classic and Unity, but GNOME 3 isn't working at all. When I log in, the screen is black for maybe 20 seconds, but the cursor shows. Then my wallpaper appears, but nothing else shows up. I can get a terminal with ctrl+alt+t, and I tried to run gnome-shell, but it segfaulted.
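    When gnome-shell segfaults after a partial update, a hedged first step is to run it by hand to capture the error and reinstall the shell packages; this is a generic sketch for 12.04 and the exact package set may differ on a given machine:
      # run the shell manually from the ctrl+alt+t terminal to see the crash output
      gnome-shell --replace 2>&1 | tee ~/gnome-shell.log
      # reinstall the shell and its JavaScript bindings (a common fix after interrupted updates)
      sudo apt-get install --reinstall gnome-shell gnome-shell-common gjs
      # move locally installed extensions out of the way, since they often break after a shell update
      mv ~/.local/share/gnome-shell/extensions ~/gnome-shell-extensions.bak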

    Read the article

  • Ubuntu 13.10 problems in apt-get update

    - by user205814
    I recently installed Ubuntu 13.10, but I have had several difficulties installing programs from the Ubuntu Software Center. I tried to update the repositories but I get the following result (the * are mine, since I can't post more than 2 links): Ign http*://security.ubuntu.com saucy-security InRelease Ign http*://extras.ubuntu.com saucy InRelease Hit http*://security.ubuntu.com saucy-security Release.gpg Hit http*://extras.ubuntu.com saucy Release.gpg Hit http*://security.ubuntu.com saucy-security Release Hit http*://extras.ubuntu.com saucy Release Hit http*://security.ubuntu.com saucy-security/main Sources Hit http*://extras.ubuntu.com saucy/main Sources Hit http*://security.ubuntu.com saucy-security/restricted Sources Hit http*://extras.ubuntu.com saucy/main amd64 Packages Hit http*://security.ubuntu.com saucy-security/universe Sources Hit http*://extras.ubuntu.com saucy/main i386 Packages Hit http*://security.ubuntu.com saucy-security/multiverse Sources Hit http*://security.ubuntu.com saucy-security/main amd64 Packages Hit http*://security.ubuntu.com saucy-security/restricted amd64 Packages Hit http*://security.ubuntu.com saucy-security/universe amd64 Packages Hit http*://security.ubuntu.com saucy-security/multiverse amd64 Packages Hit http*://security.ubuntu.com saucy-security/main i386 Packages Hit http*://security.ubuntu.com saucy-security/restricted i386 Packages Hit http*://security.ubuntu.com saucy-security/universe i386 Packages Hit http*://security.ubuntu.com saucy-security/multiverse i386 Packages Ign http*://extras.ubuntu.com saucy/main Translation-en_US Ign http*://extras.ubuntu.com saucy/main Translation-en Hit http*://security.ubuntu.com saucy-security/main Translation-en Hit http*://security.ubuntu.com saucy-security/multiverse Translation-en Hit http*://security.ubuntu.com saucy-security/restricted Translation-en Hit http*://security.ubuntu.com saucy-security/universe Translation-en Ign http*://security.ubuntu.com saucy-security/main Translation-en_US Ign http*://security.ubuntu.com saucy-security/multiverse Translation-en_US Ign http*://security.ubuntu.com saucy-security/restricted Translation-en_US Ign http*://security.ubuntu.com saucy-security/universe Translation-en_US Err http*://us.archive.ubuntu.com saucy InRelease Err http*://us.archive.ubuntu.com saucy-updates InRelease Err http*://us.archive.ubuntu.com saucy-backports InRelease Err http*://us.archive.ubuntu.com saucy Release.gpg Cannot initiate the connection to us.archive.ubuntu.com:80 (2001:67c:1562::14). - connect (101: Network is unreachable) [IP: 2001:67c:1562::14 80] Err http*://us.archive.ubuntu.com saucy-updates Release.gpg Cannot initiate the connection to us.archive.ubuntu.com:80 (2001:67c:1562::14). - connect (101: Network is unreachable) [IP: 2001:67c:1562::14 80] Err http*://us.archive.ubuntu.com saucy-backports Release.gpg Cannot initiate the connection to us.archive.ubuntu.com:80 (2001:67c:1562::14). - connect (101: Network is unreachable) [IP: 2001:67c:1562::14 80] Reading package lists... Done W: Failed to fetch http*://us.archive.ubuntu.com/ubuntu/dists/saucy/InRelease W: Failed to fetch http*://us.archive.ubuntu.com/ubuntu/dists/saucy-updates/InRelease W: Failed to fetch http*://us.archive.ubuntu.com/ubuntu/dists/saucy-backports/InRelease W: Failed to fetch http*://us.archive.ubuntu.com/ubuntu/dists/saucy/Release.gpg Cannot initiate the connection to us.archive.ubuntu.com:80 (2001:67c:1562::14). - connect (101: Network is unreachable) [IP: 2001:67c:1562::14 80] W: Failed to fetch http*://us.archive.ubuntu.com/ubuntu/dists/saucy-updates/Release.gpg Cannot initiate the connection to us.archive.ubuntu.com:80 (2001:67c:1562::14). - connect (101: Network is unreachable) [IP: 2001:67c:1562::14 80] W: Failed to fetch http*://us.archive.ubuntu.com/ubuntu/dists/saucy-backports/Release.gpg Cannot initiate the connection to us.archive.ubuntu.com:80 (2001:67c:1562::14). - connect (101: Network is unreachable) [IP: 2001:67c:1562::14 80] W: Some index files failed to download. They have been ignored, or old ones used instead. I want to install Seaview, Dropbox, Terminator and the IDLE for Python 2.7, but I can't, since I get 'There isn’t a software package called “” in your current software sources' or 'Available from the "multiverse" source'. However, for this last one, when I click "Use this Source" nothing happens. I need help. Thanks to all.
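    The Err/W lines all point at the same thing: apt is trying to reach us.archive.ubuntu.com over IPv6 (2001:67c:1562::14) and that route is unreachable, while the IPv4-only hosts (security, extras) respond fine. A hedged workaround is to prefer IPv4 for name resolution, or to force it for a single run; the gai.conf line is a standard glibc address-selection knob, and the ForceIPv4 option only exists on newer apt versions, so treat both as suggestions to verify:
      # prefer IPv4 addresses system-wide (adds a standard line to glibc's address-selection config)
      echo 'precedence ::ffff:0:0/96  100' | sudo tee -a /etc/gai.conf
      sudo apt-get update
      # or, on apt versions that support it, force IPv4 for a single run
      sudo apt-get -o Acquire::ForceIPv4=true update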

    Read the article

  • DropDownList and SelectListItem Array Item Updates in MVC

    - by Rick Strahl
    So I ran into an interesting behavior today as I deployed my first MVC 4 app tonight. I have a list form that has a filter drop down that allows selection of categories. This list is static and rarely changes so rather than loading these items from the database each time I load the items once and then cache the actual SelectListItem[] array in a static property. However, when we put the site online tonight we immediately noticed that the drop down list was coming up with pre-set values that randomly changed. Didn't take me long to trace this back to the cached list of SelectListItem[]. Clearly the list was getting updated - apparently through the model binding process in the selection postback. To clarify the scenario here's the drop down list definition in the Razor View:@Html.DropDownListFor(mod => mod.QueryParameters.Category, Model.CategoryList, "All Categories") where Model.CategoryList gets set with:[HttpPost] [CompressContent] public ActionResult List(MessageListViewModel model) { InitializeViewModel(model); busEntry entryBus = new busEntry(); var entries = entryBus.GetEntryList(model.QueryParameters); model.Entries = entries; model.DisplayMode = ApplicationDisplayModes.Standard; model.CategoryList = AppUtils.GetCachedCategoryList(); return View(model); } The AppUtils.GetCachedCategoryList() method gets the cached list or loads the list on the first access. The code to load up the list is housed in a Web utility class. The method looks like this:/// <summary> /// Returns a static category list that is cached /// </summary> /// <returns></returns> public static SelectListItem[] GetCachedCategoryList() { if (_CategoryList != null) return _CategoryList; lock (_SyncLock) { if (_CategoryList != null) return _CategoryList; var catBus = new busCategory(); var categories = catBus.GetCategories().ToList(); // Turn list into a SelectItem list var catList= categories .Select(cat => new SelectListItem() { Text = cat.Name, Value = cat.Id.ToString() }) .ToList(); catList.Insert(0, new SelectListItem() { Value = ((int)SpecialCategories.AllCategoriesButRealEstate).ToString(), Text = "All Categories except Real Estate" }); catList.Insert(1, new SelectListItem() { Value = "-1", Text = "--------------------------------" }); _CategoryList = catList.ToArray(); } return _CategoryList; } private static SelectListItem[] _CategoryList ; This seemed normal enough to me - I've been doing stuff like this forever caching smallish lists in memory to avoid an extra trip to the database. This list is used in various places throughout the application - for the list display and also when adding new items and setting up for notifications etc.. Watch that ModelBinder! However, it turns out that this code is clearly causing a problem. It appears that the model binder on the [HttpPost] method is actually updating the list that's bound to and changing the actual entry item in the list and setting its selected value. If you look at the code above I'm not setting the SelectListItem.Selected value anywhere - the only place this value can get set is through ModelBinding. Sure enough when stepping through the code I see that when an item is selected the actual model - model.CategoryList[x].Selected - reflects that. This is bad on several levels: First it's obviously affecting the application behavior - nobody wants to see their drop down list values jump all over the place randomly. 
But it's also a problem because the array is getting updated by multiple ASP.NET threads which likely would lead to odd crashes from time to time. Not good! In retrospect the modelbinding behavior makes perfect sense. The actual items and the Selected property is the ModelBinder's way of keeping track of one or more selected values. So while I assumed the list to be read-only, the ModelBinder is actually updating it on a post back producing the rather surprising results. Totally missed this during testing and is another one of those little - "Did you know?" moments. So, is there a way around this? Yes but it's maybe not quite obvious. I can't change the behavior of the ModelBinder, but I can certainly change the way that the list is generated. Rather than returning the cached list, I can return a brand new cloned list from the cached items like this:/// <summary> /// Returns a static category list that is cached /// </summary> /// <returns></returns> public static SelectListItem[] GetCachedCategoryList() { if (_CategoryList != null) { // Have to create new instances via projection // to avoid ModelBinding updates to affect this // globally return _CategoryList .Select(cat => new SelectListItem() { Value = cat.Value, Text = cat.Text }) .ToArray(); } …}  The key is that newly created instances of SelectListItems are returned not just filtered instances of the original list. The key here is 'new instances' so that the ModelBinding updates do not update the actual static instance. The code above uses LINQ and a projection into new SelectListItem instances to create this array of fresh instances. And this code works correctly - no more cross-talk between users. Unfortunately this code is also less efficient - it has to reselect the items and uses extra memory for the new array. Knowing what I know now I probably would have not cached the list and just take the hit to read from the database. If there is even a possibility of thread clashes I'm very wary of creating code like this. But since the method already exists and handles this load in one place this fix was easy enough to put in. Live and learn. It's little things like this that can cause some interesting head scratchers sometimes… © Rick Strahl, West Wind Technologies, 2005-2012. Posted in MVC, ASP.NET, .NET.

    Read the article

  • SQL Server 2005 SP4 is here!

    - by AaronBertrand
    Yes, the day has finally arrived, and a couple of weeks ahead of schedule. Typically when Microsoft promises a release in Qx or Hx, the software comes on the last or second last day of that quarter or half. This year, we get an early Christmas present: SQL Server 2005 SP4. To download SP4, go to this link: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=b953e84f-9307-405e-bceb-47bd345baece If you are looking for the Express versions of SP4, you can get Express, Express with Tools, and...(read more)

    Read the article

  • What changes can be made to a Microsoft Account using net user on Windows 8?

    - by nhinkle
    In Windows 8, you can log on with a local account or with a Microsoft Account. Both types show up in the Users control panel, but there are different options that you can change for each type. An administrator can change basically any aspect of a local user - name, password, access level, etc. For a Microsoft Account, you can only change the access level (admin/standard) or remove the account. From the command line, though, there don't appear to be any restrictions on what you can do to a Microsoft Account. Using the net user tool or the Local Users and Groups MMC snap-in, it looks like an administrator would be able to change the password, display name, profile path, etc. of Microsoft Accounts (as well as local accounts, of course). Will these commands actually work when applied to a Microsoft Account? Will using them in some way break the link to the Microsoft Account, or will nothing happen at all? I'm hesitant to test since I don't want to mess up my account permanently.

    Read the article

  • Avoid Powerpoint(/Word) 2010 creating temporary files in working directory

    - by Ben
    When opening a file, PowerPoint 2010 usually creates a temporary, hidden file called ~$filename.pptx in the same directory. This is undesirable, since it can cause unnecessary activity with e.g. Dropbox. Furthermore, the "Documents" folder should not be used for temporary files -- we have the %TEMP% folder for that. So, is it possible to have PowerPoint create its temporary files in %TEMP% instead? The following link suggests that it might not be possible: http://support.microsoft.com/kb/211632 Also, why does Microsoft not use the %TEMP% folder?

    Read the article

  • Excel - working in a bank

    - by Einsteins Grandson
    I am supposed to go to an interview at a bank, for a part-time job just supporting managers in projects. The thing is that the bank uses Excel for everything: modifications of tables with really a lot of data... What can I expect to find in the Excel test? I have some books that are around 1000 pages thick, but I don't have the time and also don't feel like reading everything that's in them. These are the books that I have: http://www.amazon.com/Excel-2010-Bible-John-Walkenbach/dp/0470474874/ref=sr_1_1?ie=UTF8&qid=1347571864&sr=8-1&keywords=excel+bible http://www.amazon.com/Excel-2010-The-Missing-Manual/dp/1449382355/ref=sr_1_1?ie=UTF8&qid=1347571884&sr=8-1&keywords=Excel+2010+The+Missing+Manual http://www.amazon.com/Microsoft-Excel-2010-In-Depth/dp/0789743086/ref=sr_1_1?ie=UTF8&qid=1347571904&sr=8-1&keywords=Microsoft+Excel+2010+In+Depth So, does anybody know a good online tutorial or a book that covers the basics and is not quite so thick? ;-) Thanks so much!!!

    Read the article

  • APT: Hold packages back from updates without APT Pin

    - by David
    I know about pinning packages with APT; that's not what I want to do. Other questions have been answered with either using pinning or by using pins temporarily. I don't want to do this... What I want to do is keep packages back the same way the kernel has been: # apt-get upgrade Reading package lists... Done Building dependency tree Reading state information... Done The following packages have been kept back: linux-generic-pae linux-headers-generic-pae linux-image-generic-pae The following packages will be upgraded: I want to add tomcat-* and mysql-* and sun-* to this list. In the past, there was a configuration parameter to do this - I've always thought it was something like Apt::Get::HoldPkgs or Apt::HoldPkgs but I can't find it. I want to have these packages held from updates until I specifically request them with an "apt-get install". I found the apt-get configuration Apt::NeverAutoRemove; will this do what I want? Added Question: I notice that Apt::NeverAutoRemove and Apt::Never-MarkAuto-Sections (among others) are not documented so far as I can see; they're not in the manpages. Neither is aptitude::Keep-Unused-Pattern and aptitude::Get-Root-Command. Is there any comprehensive and complete documentation for apt.conf?
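    What the question describes is essentially a package hold. A hedged sketch of the two usual mechanisms follows; the package names (tomcat6, mysql-server, sun-java6-jdk) are placeholders for whatever the actual tomcat-*/mysql-*/sun-* packages are on the system, and apt-mark hold requires a reasonably recent apt:
      # hold packages with apt-mark (apt's built-in hold list, no pinning involved)
      sudo apt-mark hold tomcat6 mysql-server sun-java6-jdk    # placeholder package names
      apt-mark showhold                                        # list currently held packages
      sudo apt-mark unhold mysql-server                        # release a hold when you want to upgrade
      # the older, equivalent mechanism via dpkg selections
      echo "mysql-server hold" | sudo dpkg --set-selections
      dpkg --get-selections | grep hold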

    Read the article

  • Get the latest Oracle VM updates

    - by Honglin Su
    We have released the latest Oracle VM updates for both x86 and SPARC. For Oracle VM Server for SPARC: Oracle Solaris 11 SRU 8.5 includes Oracle VM Server for SPARC 2.2, so if you're already running Solaris 11 as the control domain, all you need to do is a 'pkg update' to get the latest 2.2 bits. Learn more about how to upgrade to the latest Oracle VM Server for SPARC 2.2 release on Solaris 11 here, and consult the documentation for further details. For Oracle VM Server for x86: Download the Oracle VM Manager 3.1.1 Patch Update from My Oracle Support, patch ID 14227416. With the latest Oracle VM Manager 3.1.1 build 365, you can explore the Oracle VM Manager 3 Command Line Interface (CLI). Download the Oracle VM Server Update from the Oracle Unbreakable Linux Network. To receive notification of software updates delivered to Oracle ULN for Oracle VM, you can sign up here. For information on setting up an Oracle VM Server Yum repository and using Oracle VM Manager to perform the upgrade of Oracle VM Servers, see Updating and Upgrading Oracle VM Servers in the Oracle VM User's Guide. For more information about Oracle's virtualization, visit oracle.com/virtualization.
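    For the SPARC side, the 'pkg update' step on the Solaris 11 control domain looks roughly like the sketch below; this is a generic IPS update sequence, not lifted from the post, so check the Oracle VM Server for SPARC documentation before running it:
      # on the Solaris 11 control domain, as a privileged user
      pkg list -v entire        # see which SRU the system is currently on
      pkg update --accept       # pull the latest SRU, including the Oracle VM Server for SPARC bits
      init 6                    # a new boot environment is usually created; reboot into it to finish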

    Read the article

  • Unable to install updates on 14.04 LTS

    - by Mike
    I have been getting update notifications for a few weeks now, but whenever I attempt to install them I get this message: The upgrade needs a total of 74.6 M free space on disk '/boot'. Please free at least an additional 29.8 M of disk space on '/boot'. Empty your trash and remove temporary packages of former installations using 'sudo apt-get clean'. First of all, I don't have permission to access /boot (don't know why, as it's a standalone machine and I'm the only user). Secondly, I emptied the trash. Thirdly, I launched Terminal and entered sudo apt-get clean. I was asked for a sudo password. I entered my system password. Re-entered sudo apt-get clean. The cursor stopped blinking - I assumed it was doing its "thing". I let it go for about 10 minutes, then exited Terminal. Tried to install the updates but just got the same message. Is there something I'm ignorant of? This is the output I get from the command df -h, and I have no idea what it all means! @Tim, what's bash and why am I denied access to fstab and /boot? mike@mike-MS-7800:~$ /etc/fstab bash: /etc/fstab: Permission denied mike@mike-MS-7800:~$ df -h Filesystem Size Used Avail Use% Mounted on /dev/mapper/ubuntu--vg-root 913G 11G 856G 2% / none 4.0K 0 4.0K 0% /sys/fs/cgroup udev 1.7G 4.0K 1.7G 1% /dev tmpfs 335M 1.6M 333M 1% /run none 5.0M 4.0K 5.0M 1% /run/lock none 1.7G 14M 1.7G 1% /run/shm none 100M 52K 100M 1% /run/user /dev/sda2 237M 182M 43M 81% /boot /dev/sda1 487M 3.4M 483M 1% /boot/efi /dev/sr1 31M 31M 0 100% /media/mike/Optus Mobile mike@mike-MS-7800:~$ I ran this from the terminal and all is now working. dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -y purge
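    The df output shows the real problem: /boot is a separate 237M partition that is 81% full of old kernels, so 'apt-get clean' cannot help. The one-liner the poster found purges everything except the running kernel; a more readable, hedged version of the same idea is sketched below (the kernel version shown is only an example, not from the question):
      uname -r                                        # note the kernel you are booted into - never purge this one
      dpkg -l 'linux-image-*' | grep ^ii              # list installed kernel image packages
      # purge one or more old kernels by name (3.13.0-24 here is only an example)
      sudo apt-get purge linux-image-3.13.0-24-generic linux-headers-3.13.0-24-generic
      sudo apt-get autoremove                         # let apt remove remaining unused kernel packages
      df -h /boot                                     # confirm there is now enough free space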

    Read the article

  • JSR updates - November 2013

    - by Heather VanCura
     This week has been a busy week for JCP participants! Ten JSRs related to the upcoming Java Standard Edition (Java SE) 8 release posted public reviews--four Public Reviews and six Maintenance Reviews.  All JSRs are operating under the latest version of the JCP program and have public feedback mechanisms and issue trackers.  Please review and comment on these JSRs--your input and participation is wanted and needed!  JSR 308, Annotations on Java Types, published a Public Review. This review closes 4 December. JSR 310, Date and Time API, published a Public Review. This review closes 4 December. JSR 335, Lambda Expressions for the Java Programming Language, published a Public Review. This review closes 4 December. JSR 337, Java SE 8 Release contents, published a Public Review.  This review closes 4 December. JSR 221, JDBC 4.0 API, published a Maintenance Review.  This review closes 4 December. JSR 199, Java Compiler API, published a Maintenance Review.  This review closes 4 December. JSR 160, Java Management Extensions Remote API, published a Maintenance Review.  This review closes 4 December. JSR 114, JDBC Rowset Implementations, published a Maintenance Review.  This review closes 4 December. JSR 3, Java Management Extensions Specification, published a Maintenance Review.  This review closes 4 December. JSR 206, Java API for XML Processing,  published a Maintenance Review.  This review closes 22 November. Two other JSRs also published recent updates:  JSR 354, Money and Currency API, published a Public Review.  This review closes 23 November.  JSR 107, JCACHE - Java Temporary Caching API, published a Proposed Final Draft.

    Read the article

  • Java web app, with plugin framework and ability to connect to source for updates

    - by lessthancommon
    I've searched all around for some good sources, but either I have been searching for the wrong keywords, or I'm just missing something. I'm looking to redevelop a web app I've been using for some time now. Many parts are out of date, and we're constantly throwing in little hacks to attempt to give it new life. So what I'd like to do is re-engineer it from the ground up, built on some sort of plug-in framework. Before I continue, I'm more or less an intermediate Java programmer. In some ways, I'm hoping to use this project as a big learning experience. I've read a lot about OSGi, and it seems that's the most complete framework. Ideally, the end result would be a web app where I can run one instance as my hosting environment, and other instances can connect to it to grab new and updated plug-ins. Eventually I'll want to lock down these plug-ins based on some undecided criteria of who can get them (basically some will simply be updates, others will provide new functionality and should be "purchased" through an external system). But that will probably be handled in a later phase. There should be an administration view for managing bundles in a hot environment (looking to avoid having to restart the server for an update). I know all these things are possible, I'm just trying to find some good resources for reference. All the OSGi tutorials I'm finding seem to be too simplistic. If anyone here can guide me in the right direction on any or all of the items I'm looking for, it would be much appreciated. Also, this is my first post, so I'll take any comments/criticisms about the content of my post. Thanks!

    Read the article

  • Feature Updates to the Windows Azure Portal

    - by Clint Edmonson
    Lots of activity over at the Windows Azure portal this weekend, including some exciting new features and major improvements to existing features. Here are the highlights:
    Support for Managing Co-administrators - Set up account co-administrators to allow others to share service management duties for each Azure subscription.
    Import/Export support for SQL Databases - Export existing SQL Azure databases to blob storage using SQL Server 2012's BACPAC format; create a new SQL Azure database from an existing BACPAC stored in blob storage.
    Storage Container Management and Access Control - Create blob storage containers directly within the portal, edit their public/private access settings, and drill into storage containers to see the blobs contained within them.
    Improved Cloud Service Status Notifications - Detailed health status information about cloud services and roles as they transition between states.
    Virtual Machine Experience Enhancements - Option to automatically delete corresponding VHD files from blob storage when deleting VM disks.
    Service Bus Management and Monitoring - Ability to create and manage Service Bus Namespaces, Queues, Topics, Relays and Subscriptions; rich monitoring of Topics, Queues, and Subscriptions with detailed and customizable dashboard metrics; entity status (Topic, Queue, or Subscription) can be changed interactively via the dashboard; direct links to the Access Control Services (ACS) namespaces when working with Service Bus access keys.
    Media Services Monitoring Support - Monitor encoding jobs that are queued for processing as well as active, failed and queued tasks for encoding jobs.
    The above features are all now live in production and available to use immediately. If you don't already have a Windows Azure account, you can sign up for a free trial and start using them today. Stay tuned to my twitter feed for Windows Azure announcements, updates, and links: @clinted Reference ID: P7VVJCM38V8R

    Read the article

  • Using Google App Engine to Perform World Updates vs an Authoritative Server

    - by Error 454
    I am considering different game server architectures that use GAE. The types of games I am considering are turn-based, where the world status would need to be updated about once per minute. I am looking for an answer that persuades me to either perform the world update on the Google servers OR on an authoritative server that syncs with the datastore. The main goal here would be to minimize use of the GAE daily quotas. For some rough numbers, I am assuming 10,000 entities requiring updates. Each entity update would require: Reading 5 private entity variables (fetched from datastore) Fetching as many as 20 static variables (from datastore or persisted in server memory) Writing 5 entity variables Clients of the game would authenticate and set state directly against GAE as well as pull the latest world state from GAE. Running the update on GAE would consist of a cron job launched every minute. This would update all of the entities and save the results to the datastore. This would be more CPU intensive for GAE. Running the update on an authoritative server would consist of fetching entity data from the GAE datastore, calculating the new entity states and pushing the new state variables back to the datastore. This would be more bandwidth intensive for the datastore.

    Read the article

  • Them and us

    - by Plip
    As much as we try and create inclusive societies throughout the globe, time and time again we revert to our tribal and clan origins back in the distant past, be those lines split across the obvious like Nationality, Religion or even the Football teams we follow. Microsoft to me has always been a "them". I was always on the outside looking in, free to say as I wished and have an external, objective viewpoint. Now, after my first week (well, four days, but who's counting) Microsoft is an "us" for me. So when I look up in the Atrium of Building 1 at Microsoft's UK headquarters and see banners like the one above, I already genuinely feel a part of this much bigger community. I looked up at that and I felt a sense of pride to be part of something bigger, something which is out there touching people's lives everywhere (for the good and the bad). My objectivity has made me who I am today. I'm open to other ideas and concepts; I've worked hard to be understanding across the board, be it technology through to cultural differences, and it's vital to me that I preserve that. So I now have to learn how to balance the "them" of Microsoft with the "us" of Microsoft and maintain the objectivity. It's my job to advise people on the best way to do things, which won't always mean "Use Microsoft Technology X"; sometimes it'll be my responsibility to say "Don't use Microsoft Technology X". My first and foremost responsibility is to the customer, to give them the best advice that I can, and I want to maintain that. Yeah, I'm sure I'll be tarred by some as a Microsoft guy (for many years I've had just that), but those out there in the non-Microsoft communities I've engaged in, I think, know that I'm the first to say when I think something is a bit naff. So, here's my ask of you, 'the community': keep me honest. If I start to sound like a fanboi, I want you to find me and give me a slap. It's all too easy to forget reality sometimes, and I want to make sure I stay well and truly rooted in that reality. Also, no matter how much I embed myself within Microsoft, I fear I will never understand Microsoft's marketing team. In the Gents just under the WP7 banner shown above I was faced with this. Draw your own conclusions on what its message is.

    Read the article

  • Microsoft Access and Java JDBC-ODBC Error

    - by user1638362
    Trying to insert some values into a Microsoft Access database using Java. I get an error, however: java.sql.SQLException: [Microsoft][ODBC Driver Manager] The specified DSN contains an architecture mismatch between the Driver and Application Exception in thread "main" java.lang.NullPointerException To create the data source I'm using the SysWoW64 odbcad32 and adding the data source to the System DSN. I say this as I have seen elsewhere that there are problems which occur with 64-bit systems. However, it still doesn't work for me. Microsoft Office is 32-bit. import java.sql.Connection; import java.sql.DriverManager; import java.sql.SQLException; import java.sql.Statement; public class RMIAuctionHouseJDBC { /** * @param args */ public static void main(String[] args) { String theItem = "Car"; String theClient="KHAN"; String theMessage="1001"; Connection conn =null; // Create connection object try{ Class.forName("sun.jdbc.odbc.JdbcOdbcDriver"); System.out.println("Driver Found"); } catch(Exception e) { System.out.println("Driver Not Found"); System.err.println(e); } // connecting to database try{ String database ="jdbc:odbc:Driver={Microsoft Access Driver (*.accdb)};DBQ=AuctionHouseDatabase.accdb;"; conn = DriverManager.getConnection(database,"",""); System.out.println("Conn Found"); } catch(SQLException se) { System.out.println("Conn Not Found"); System.err.println(se); } // Create select statement and execute it try{ /*String insertSQL = "INSERT INTO AuctionHouse VALUES ( " +"'" +theItem+"', " +"'" +theClient+"', " +"'" +theMessage+"')"; */ Statement stmt = conn.createStatement(); String insertSQL = "Insert into AuctionHouse VALUES ('Item','Name','Price')"; stmt.executeUpdate(insertSQL); // Retrieve the results conn.close(); } catch(SQLException se) { System.out.println("SqlStatment Not Found"); System.err.println(se); } } } java.sql.SQLException: [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified at sun.jdbc.odbc.JdbcOdbc.createSQLException(Unknown Source) at sun.jdbc.odbc.JdbcOdbc.standardError(Unknown Source) at sun.jdbc.odbc.JdbcOdbc.SQLDriverConnect(Unknown Source) at sun.jdbc.odbc.JdbcOdbcConnection.initialize(Unknown Source)

    Read the article

  • Windows Azure: Major Updates for Mobile Backend Development

    - by ScottGu
    This week we released some great updates to Windows Azure that make it significantly easier to develop mobile applications that use the cloud. These new capabilities include: Mobile Services: Custom API support Mobile Services: Git Source Control support Mobile Services: Node.js NPM Module support Mobile Services: A .NET API via NuGet Mobile Services and Web Sites: Free 20MB SQL Database Option for Mobile Services and Web Sites Mobile Notification Hubs: Android Broadcast Push Notification Support All of these improvements are now available to use immediately (note: some are still in preview).  Below are more details about them. Mobile Services: Custom APIs, Git Source Control, and NuGet Windows Azure Mobile Services provides the ability to easily stand up a mobile backend that can be used to support your Windows 8, Windows Phone, iOS, Android and HTML5 client applications.  Starting with the first preview we supported the ability to easily extend your data backend logic with server side scripting that executes as part of client-side CRUD operations against your cloud back data tables. With today’s update we are extending this support even further and introducing the ability for you to also create and expose Custom APIs from your Mobile Service backend, and easily publish them to your Mobile clients without having to associate them with a data table. This capability enables a whole set of new scenarios – including the ability to work with data sources other than SQL Databases (for example: Table Services or MongoDB), broker calls to 3rd party APIs, integrate with Windows Azure Queues or Service Bus, work with custom non-JSON payloads (e.g. Windows Periodic Notifications), route client requests to services back on-premises (e.g. with the new Windows Azure BizTalk Services), or simply implement functionality that doesn’t correspond to a database operation.  The custom APIs can be written in server-side JavaScript (using Node.js) and can use Node’s NPM packages.  We will also be adding support for custom APIs written using .NET in the future as well. Creating a Custom API Adding a custom API to an existing Mobile Service is super easy.  Using the Windows Azure Management Portal you can now simply click the new “API” tab with your Mobile Service, and then click the “Create a Custom API” button to create a new Custom API within it: Give the API whatever name you want to expose, and then choose the security permissions you’d like to apply to the HTTP methods you expose within it.  You can easily lock down the HTTP verbs to your Custom API to be available to anyone, only those who have a valid application key, only authenticated users, or administrators.  Mobile Services will then enforce these permissions without you having to write any code: When you click the ok button you’ll see the new API show up in the API list.  Selecting it will enable you to edit the default script that contains some placeholder functionality: Today’s release enables Custom APIs to be written using Node.js (we will support writing Custom APIs in .NET as well in a future release), and the Custom API programming model follows the Node.js convention for modules, which is to export functions to handle HTTP requests. The default script above exposes functionality for an HTTP POST request. To support a GET, simply change the export statement accordingly.  
Below is an example of some code for reading and returning data from Windows Azure Table Storage using the Azure Node API: After saving the changes, you can now call this API from any Mobile Service client application (including Windows 8, Windows Phone, iOS, Android or HTML5 with CORS). Below is the code for how you could invoke the API asynchronously from a Windows Store application using .NET and the new InvokeApiAsync method, and data-bind the results to control within your XAML:     private async void RefreshTodoItems() {         var results = await App.MobileService.InvokeApiAsync<List<TodoItem>>("todos", HttpMethod.Get, parameters: null);         ListItems.ItemsSource = new ObservableCollection<TodoItem>(results);     }    Integrating authentication and authorization with Custom APIs is really easy with Mobile Services. Just like with data requests, custom API requests enjoy the same built-in authentication and authorization support of Mobile Services (including integration with Microsoft ID, Google, Facebook and Twitter authentication providers), and it also enables you to easily integrate your Custom API code with other Mobile Service capabilities like push notifications, logging, SQL, etc. Check out our new tutorials to learn more about to use new Custom API support, and starting adding them to your app today. Mobile Services: Git Source Control Support Today’s Mobile Services update also enables source control integration with Git.  The new source control support provides a Git repository as part your Mobile Service, and it includes all of your existing Mobile Service scripts and permissions. You can clone that git repository on your local machine, make changes to any of your scripts, and then easily deploy the mobile service to production using Git. This enables a really great developer workflow that works on any developer machine (Windows, Mac and Linux). To use the new support, navigate to the dashboard for your mobile service and select the Set up source control link: If this is your first time enabling Git within Windows Azure, you will be prompted to enter the credentials you want to use to access the repository: Once you configure this, you can switch to the configure tab of your Mobile Service and you will see a Git URL you can use to use your repository: You can use this URL to clone the repository locally from your favorite command line: > git clone https://scottgutodo.scm.azure-mobile.net/ScottGuToDo.git Below is the directory structure of the repository: As you can see, the repository contains a service folder with several subfolders. Custom API scripts and associated permissions appear under the api folder as .js and .json files respectively (the .json files persist a JSON representation of the security settings for your endpoints). Similarly, table scripts and table permissions appear as .js and .json files, but since table scripts are separate per CRUD operation, they follow the naming convention of <tablename>.<operationname>.js. Finally, scheduled job scripts appear in the scheduler folder, and the shared folder is provided as a convenient location for you to store code shared by multiple scripts and a few miscellaneous things such as the APNS feedback script. 
Lets modify the table script todos.js file so that we have slightly better error handling when an exception occurs when we query our Table service: todos.js tableService.queryEntities(query, function(error, todoItems){     if (error) {         console.error("Error querying table: " + error);         response.send(500);     } else {         response.send(200, todoItems);     }        }); Save these changes, and now back in the command line prompt commit the changes and push them to the Mobile Services: > git add . > git commit –m "better error handling in todos.js" > git push Once deployment of the changes is complete, they will take effect immediately, and you will also see the changes be reflected in the portal: With the new Source Control feature, we’re making it really easy for you to edit your mobile service locally and push changes in an atomic fashion without sacrificing ease of use in the Windows Azure Portal. Mobile Services: NPM Module Support The new Mobile Services source control support also allows you to add any Node.js module you need in the scripts beyond the fixed set provided by Mobile Services. For example, you can easily switch to use Mongo instead of Windows Azure table in our example above. Set up Mongo DB by either purchasing a MongoLab subscription (which provides MongoDB as a Service) via the Windows Azure Store or set it up yourself on a Virtual Machine (either Windows or Linux). Then go the service folder of your local git repository and run the following command: > npm install mongoose This will add the Mongoose module to your Mobile Service scripts.  After that you can use and reference the Mongoose module in your custom API scripts to access your Mongo database: var mongoose = require('mongoose'); var schema = mongoose.Schema({ text: String, completed: Boolean });   exports.get = function (request, response) {     mongoose.connect('<your Mongo connection string> ');     TodoItemModel = mongoose.model('todoitem', schema);     TodoItemModel.find(function (err, items) {         if (err) {             console.log('error:' + err);             return response.send(500);         }         response.send(200, items);     }); }; Don’t forget to push your changes to your mobile service once you are done > git add . > git commit –m "Switched to use Mongo Labs" > git push Now our Mobile Service app is using Mongo DB! Note, with today’s update usage of custom Node.js modules is limited to Custom API scripts only. We will enable it in all scripts (including data and custom CRON tasks) shortly. New Mobile Services NuGet package, including .NET 4.5 support A few months ago we announced a new pre-release version of the Mobile Services client SDK based on portable class libraries (PCL). Today, we are excited to announce that this new library is now a stable .NET client SDK for mobile services and is no longer a pre-release package. Today’s update includes full support for Windows Store, Windows Phone 7.x, and .NET 4.5, which allows developers to use Mobile Services from ASP.NET or WPF applications. You can install and use this package today via NuGet. Mobile Services and Web Sites: Free 20MB Database for Mobile Services and Web Sites Starting today, every customer of Windows Azure gets one Free 20MB database to use for 12 months free (for both dev/test and production) with Web Sites and Mobile Services. 
When creating a Mobile Service or a Web Site, simply chose the new “Create a new Free 20MB database” option to take advantage of it: You can use this free SQL Database together with the 10 free Web Sites and 10 free Mobile Services you get with your Windows Azure subscription, or from any other Windows Azure VM or Cloud Service. Notification Hubs: Android Broadcast Push Notification Support Earlier this year, we introduced a new capability in Windows Azure for sending broadcast push notifications at high scale: Notification Hubs. In the initial preview of Notification Hubs you could use this support with both iOS and Windows devices.  Today we’re excited to announce new Notification Hubs support for sending push notifications to Android devices as well. Push notifications are a vital component of mobile applications.  They are critical not only in consumer apps, where they are used to increase app engagement and usage, but also in enterprise apps where up-to-date information increases employee responsiveness to business events.  You can use Notification Hubs to send push notifications to devices from any type of app (a Mobile Service, Web Site, Cloud Service or Virtual Machine). Notification Hubs provide you with the following capabilities: Cross-platform Push Notifications Support. Notification Hubs provide a common API to send push notifications to iOS, Android, or Windows Store at once.  Your app can send notifications in platform specific formats or in a platform-independent way.  Efficient Multicast. Notification Hubs are optimized to enable push notification broadcast to thousands or millions of devices with low latency.  Your server back-end can fire one message into a Notification Hub, and millions of push notifications can automatically be delivered to your users.  Devices and apps can specify a number of per-user tags when registering with a Notification Hub. These tags do not need to be pre-provisioned or disposed, and provide a very easy way to send filtered notifications to an infinite number of users/devices with a single API call.   Extreme Scale. Notification Hubs enable you to reach millions of devices without you having to re-architect or shard your application.  The pub/sub routing mechanism allows you to broadcast notifications in a super-efficient way.  This makes it incredibly easy to route and deliver notification messages to millions of users without having to build your own routing infrastructure. Usable from any Backend App. Notification Hubs can be easily integrated into any back-end server app, whether it is a Mobile Service, a Web Site, a Cloud Service or an IAAS VM. It is easy to configure Notification Hubs to send push notifications to Android. 
Create a new Notification Hub within the Windows Azure Management Portal (New->App Services->Service Bus->Notification Hub): Then register for Google Cloud Messaging using https://code.google.com/apis/console and obtain your API key, then simply paste that key on the Configure tab of your Notification Hub management page under the Google Cloud Messaging Settings: Then just add code to the OnCreate method of your Android app’s MainActivity class to register the device with Notification Hubs: gcm = GoogleCloudMessaging.getInstance(this); String connectionString = "<your listen access connection string>"; hub = new NotificationHub("<your notification hub name>", connectionString, this); String regid = gcm.register(SENDER_ID); hub.register(regid, "myTag"); Now you can broadcast notification from your .NET backend (or Node, Java, or PHP) to any Windows Store, Android, or iOS device registered for “myTag” tag via a single API call (you can literally broadcast messages to millions of clients you have registered with just one API call): var hubClient = NotificationHubClient.CreateClientFromConnectionString(                   “<your connection string with full access>”,                   "<your notification hub name>"); hubClient.SendGcmNativeNotification("{ 'data' : {'msg' : 'Hello from Windows Azure!' } }", "myTag”); Notification Hubs provide an extremely scalable, cross-platform, push notification infrastructure that enables you to efficiently route push notification messages to millions of mobile users and devices.  It will make enabling your push notification logic significantly simpler and more scalable, and allow you to build even better apps with it. Learn more about Notification Hubs here on MSDN . Summary The above features are now live and available to start using immediately (note: some of the services are still in preview).  If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using them today.  Visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • ReSharper - Possible Null Assignment when using Microsoft.Contracts

    - by HVS
    Is there any way to indicate to ReSharper that a null reference won't occur because of Design-by-Contract Requires checking? For example, the following code will raise the warning (Possible 'null' assignment to entity marked with 'NotNull' attribute) in ReSharper on lines 7 and 8: private Dictionary<string, string> _Lookup = new Dictionary<string, string>(); public void Foo(string s) { Contract.Requires(!String.IsNullOrEmpty(s)); if (_Lookup.ContainsKey(s)) _Lookup.Remove(s); } What is really odd is that if you remove the Contract.Requires(...) line, the ReSharper message goes away. Update I found the solution through ExternalAnnotations which was also mentioned by Mike below. Here's an example of how to do it for a function in Microsoft.Contracts: Create a directory called Microsoft.Contracts under the ExternalAnnotations ReSharper directory. Next, Create a file called Microsoft.Contracts.xml and populate like so: <assembly name="Microsoft.Contracts"> <member name="M:System.Diagnostics.Contracts.Contract.Requires(System.Boolean)"> <attribute ctor="M:JetBrains.Annotations.AssertionMethodAttribute.#ctor"/> <parameter name="condition"> <attribute ctor="M:JetBrains.Annotations.AssertionConditionAttribute.#ctor(JetBrains.Annotations.AssertionConditionType)"> <argument>0</argument> </attribute> </parameter> </member> </assembly> Restart Visual Studio, and the message goes away!

    Read the article

< Previous Page | 18 19 20 21 22 23 24 25 26 27 28 29  | Next Page >