Search Results

Search found 1517 results on 61 pages for 'migrate'.

Page 37/61 | < Previous Page | 33 34 35 36 37 38 39 40 41 42 43 44  | Next Page >

  • PC On/Off Time Charts Windows Uptime; No Logging Necessary

    - by Jason Fitzpatrick
    Windows: PC On/Off Time is a graphical tool that displays your PC’s uptime, downtime, errors, and more, all in a clear and portable package. One of the hassles of using logging tools is that you usually have to enable the logging and then wait for results to pile up before seeing anything useful (such as when you turn on logging on your router). PC On/Off Time taps right into the event logs your Windows PC is already keeping, so you get immediate access to your uptime history. If you look at the screenshot above you can see an accurate picture of the last few weeks of uptime on my computer. October 23-24 I didn’t shut down my PC; the rest of the time I hibernated it overnight when I wasn’t using it. November 1st I installed an SSD (you can see the burst of reboots and short uptimes), and then November 9th there was a brief power outage that caused an unexpected stop (the red arrows on the timeline for the 9th). The free version offers a three-week peek back into your uptime history (upgrade to the Pro version for $12.75, or for free using Trial Pay, to unlock your complete uptime history). PC On/Off Time is Windows only. PC On/Off Time [via Addictive Tips]
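
    The "no logging necessary" trick is simply that Windows already records boot and shutdown markers in the System event log, and the tool reads them back. As a rough illustration of the same idea (not part of the tool), here is a minimal Python sketch that assumes the pywin32 package is installed and uses the well-known 6005/6006/6008 EventLog start/stop/unexpected-shutdown event IDs:

        # Sketch: list boot/shutdown markers from the Windows System event log.
        # Assumes pywin32; 6005/6006 are the "Event Log service started/stopped"
        # events and 6008 marks an unexpected shutdown (like the power outage above).
        import win32evtlog

        BOOT, SHUTDOWN, UNEXPECTED = 6005, 6006, 6008

        handle = win32evtlog.OpenEventLog(None, "System")   # local machine, System log
        flags = win32evtlog.EVENTLOG_BACKWARDS_READ | win32evtlog.EVENTLOG_SEQUENTIAL_READ

        markers = []
        while True:
            batch = win32evtlog.ReadEventLog(handle, flags, 0)
            if not batch:
                break
            for event in batch:
                event_id = event.EventID & 0xFFFF           # strip severity/qualifier bits
                if event_id in (BOOT, SHUTDOWN, UNEXPECTED):
                    markers.append((event.TimeGenerated, event_id))
        win32evtlog.CloseEventLog(handle)

        labels = {BOOT: "boot", SHUTDOWN: "shutdown", UNEXPECTED: "unexpected stop"}
        for when, event_id in sorted(markers):
            print(when, labels[event_id])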

    Read the article

  • Gnome-Network-Manager Config File Migration

    - by Jorge
    I think I have an issue with gnome-network-manager. I used to have a lot of connections configured: wireless, wired and VPN. After upgrading to 12.04 (from 11.10) I lost every configuration. I realized that the configs that used to be saved in $HOME/.gconf/system/networking/connections are now being saved in /etc/NetworkManager/system-connections/. I don't know how to migrate my settings to the new config file format. Can anybody help me?

    jorge@thinky:~$ sudo lshw -C network
    *-network description: Ethernet interface product: 82566MM Gigabit Network Connection vendor: Intel Corporation physical id: 19 bus info: pci@0000:00:19.0 logical name: eth0 version: 03 serial: 00:1f:e2:14:5a:9b capacity: 1Gbit/s width: 32 bits clock: 33MHz capabilities: pm msi bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=e1000e driverversion=1.5.1-k firmware=0.3-0 latency=0 link=no multicast=yes port=twisted pair resources: irq:46 memory:fe000000-fe01ffff memory:fe025000-fe025fff ioport:1840(size=32)
    *-network description: Wireless interface product: PRO/Wireless 4965 AG or AGN [Kedron] Network Connection vendor: Intel Corporation physical id: 0 bus info: pci@0000:03:00.0 logical name: wlan0 version: 61 serial: 00:21:5c:32:c2:e5 width: 64 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=iwl4965 driverversion=3.2.0-23-generic-pae firmware=228.61.2.24 ip=192.168.2.103 latency=0 link=yes multicast=yes wireless=IEEE 802.11abgn resources: irq:47 memory:df3fe000-df3fffff
    jorge@thinky:~$ lsb_release -a
    No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 12.04 LTS Release: 12.04 Codename: precise
    jorge@thinky:~$ uname -a
    Linux thinky 3.2.0-23-generic-pae #36-Ubuntu SMP Tue Apr 10 22:19:09 UTC 2012 i686 i686 i386 GNU/Linux
    jorge@thinky:~$ dpkg -l | grep -i firm
    ii linux-firmware 1.79 Firmware for Linux kernel drivers
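
    For reference, the new format is NetworkManager's INI-style "keyfile": one file per connection under /etc/NetworkManager/system-connections/, owned by root with 0600 permissions. There is no automatic converter for the old per-user gconf entries, but a connection can be recreated by writing such a file. The sketch below is only an illustration with placeholder values (connection name, SSID, passphrase), and the section/key names should be double-checked against a keyfile that NetworkManager itself has written on the 12.04 machine:

        # Sketch: recreate one WPA-PSK wireless connection as a NetworkManager keyfile.
        # Placeholder values throughout; run as root, then restart NetworkManager so it
        # picks up the new file.
        import configparser
        import os
        import uuid

        name = "HomeWifi"                 # placeholder: the old connection's name
        ssid = "HomeWifi"                 # placeholder: SSID from the old gconf entry
        psk = "secret-passphrase"         # placeholder: passphrase from the old gconf entry

        keyfile = configparser.ConfigParser()
        keyfile.optionxform = str         # keep key names case-sensitive
        keyfile["connection"] = {"id": name, "uuid": str(uuid.uuid4()), "type": "802-11-wireless"}
        keyfile["802-11-wireless"] = {"ssid": ssid, "mode": "infrastructure",
                                      "security": "802-11-wireless-security"}
        keyfile["802-11-wireless-security"] = {"key-mgmt": "wpa-psk", "psk": psk}
        keyfile["ipv4"] = {"method": "auto"}
        keyfile["ipv6"] = {"method": "auto"}

        path = "/etc/NetworkManager/system-connections/" + name
        with open(path, "w") as f:
            keyfile.write(f, space_around_delimiters=False)
        os.chmod(path, 0o600)             # NetworkManager ignores world-readable keyfiles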

    Read the article

  • Getting Internal Names of SharePoint List Fields

    - by Gino Abraham
    Over the last two weeks I was developing a tool to migrate a Lotus Notes database to SharePoint. The mapping between the Lotus Notes schema and the SharePoint list schema was done manually in an XML file for our tool. To map the columns we wanted the internal names of each field. There are quite a few ways to achieve this; I have explained a few below. If you want the internal names for one or two columns you can get them by navigating to the list settings and clicking on the column name. Once you are in the column's details, check the query string of the page: the last item in the query string is the field's internal name. Replacing every "%5f" with '_' gives you the internal name. In my case there were more than 80 columns, so I used PowerShell to get the list of columns with details. Open Windows PowerShell and paste the following script after modifying the URL and list name:

        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
        $site = new-object Microsoft.SharePoint.SPSite("http://yoursitecollectionurl")
        $web = $site.OpenWeb()
        $list = $web.Lists["yourlist name"]
        $list.Fields | Format-Table Title, InternalName, TypeAsString

    I also found a tool on CodePlex which can generate a wrapper class for a list. The wrapper class will give you the GUID and internal name for every field in the list. You can download the tool from http://imtech.codeplex.com/ Just enter the URL in the text box and hit Open. All the site content will be listed on the left hand side; expand the list, right click and select Generate Wrapper Class.

    Read the article

  • Configuration tools for multiple monitors for X / Linux

    - by richard
    I have Ubuntu 10.04 running GNOME and two monitors. I am wondering if I can get a better multi-monitor configuration tool. The one I have, gnome-display-properties, has too many problems. For example, when I swapped my monitors over, with the narrower (external) one now on the left, there was a width calculation error, such that I had a virtual monitor the width of the wide monitor spanning the narrow monitor and part of the wide monitor, and a virtual narrow monitor on the remainder of the wide monitor. Also, the visible mouse pointer is not aligned with its active spot; there is an x offset of one monitor width. I would like, in approximate order of importance:
    - no bugs
    - to be able to select which is the primary monitor
    - to have multiple configurations
    - configurations to be automatically selected based on which monitors are attached
    - configurations to be cycled (reliably) when the display mode key is pressed
    - when a display is deactivated, for windows to migrate to the remaining monitors
    - an option to not change display resolution when mirroring, but to use side/top blanking bars to pad out the screen

    Read the article

  • Migrating BizTalk 2006 R2 to BizTalk 2010 XLANGs Issue

    - by SURESH GIRIRAJAN
    When we migrated some BizTalk apps from BizTalk 2006 R2 to BizTalk 2010, we ran into an issue when a .NET component was called inside an orchestration. In the .NET component we were trying to retrieve a promoted property, and we had checked in the BizTalk group hub to validate that it was promoted; no issues there. Only when we tried to access the data inside the .NET component did we have a problem. We had just moved all the assemblies we had in BizTalk 2006 R2 over to BizTalk 2010 without recompiling anything in the BizTalk 2010 environment. Looking further, a couple of new namespaces added under Microsoft.XLANGs… in BizTalk 2010 compared to BizTalk 2006 R2 caused the issue. So all we did to fix it was recompile the project in the 2010 environment, and it worked fine. So it looks like a backward compatibility issue.

        public static void Load(XLANGMessage msg)
        {
            try
            {
                // get the process id from context.
                object ctxVal = msg.GetPropertyValue(typeof(ProcessID));
                …
            }
        }

    BizTalk 2010: error message in the event viewer:
        The service instance will remain suspended until administratively resumed or terminated. If resumed the instance will continue from its last persisted state and may re-throw the same unexpected exception.
        InstanceId: 441d73d3-2e84-49d2-b6bd-7218065b5e1d
        Shape name: Bulk Load
        ShapeId: bb959e56-9221-48be-a80f-24051196617d
        Exception thrown from: segment 1, progress 65
        Inner exception: A property cannot be associated with the type 'Tellago.Common.Schemas.ProcessId'.
        Exception type: InvalidPropertyTypeException
        Source: Microsoft.XLANGs.Engine
        Target Site: Microsoft.XLANGs.RuntimeTypes.MessagePropertyDefinition _getMessagePropertyDefinition(System.Type)
        The following is a stack trace that identifies the location where the exception occured:
        at Microsoft.XLANGs.Core.XMessage._getMessagePropertyDefinition(Type propType)
        at Microsoft.XLANGs.Core.XMessage.GetContentProperty(Type propType)
        at Microsoft.XLANGs.Core.XMessage.GetPropertyValue(Type propType)
        at Microsoft.BizTalk.XLANGs.BTXEngine.BTXMessage.GetPropertyValue(Type propType)
        at Microsoft.XLANGs.Core.MessageWrapperForUserCode.GetPropertyValue(Type propType)
        at Tellago.Common.Components.Load(XLANGMessage msg)
        at Tellago.SuspensionProcess.segment1(StopConditions stopOn)
        at Microsoft.XLANGs.Core.SegmentScheduler.RunASegment(Segment s, StopConditions stopCond, Exception& exp)

    Read the article

  • URL length and content optimised for SEO

    - by Brendan Vogt
    I have done some reading on what URLs should look like for search engine optimisation, but I am curious to know how mine should look; I need some advice. I have a tutorial website, and my categories are something like: Web Development -> Client Side -> JavaScript. So if I have a tutorial called "What is JavaScript?", is it good to have a URL that looks something like: www.MyWebsite.com/web-development/client-side/javascript/what-is-javascipt Or would something like this be more appropriate: www.MyWebsite.com/tutorials/what-is-javascipt Just curious, because I also read that it is wise to have keywords in your URLs. Do I need to add the identifiers of each category in the link as well, something like: www.MyWebsite.com/1/web-development/5/client-side/15/javascript/100/what-is-javascipt where 1 is the unique identifier (primary key) of the category Web Development, 5 of the category Client Side, 15 of the category JavaScript, and 100 of the tutorial "What is JavaScript". UPDATE: This is not a programming question, so can someone please help migrate this to the correct Q&A site without downvoting my question?
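
    Whichever structure is chosen, the keyword part of the URL is usually just a slug derived from the title, and if a numeric key is needed for lookups most sites embed a single id next to the slug rather than one id per category level. A rough Python sketch of that idea (the helper names are made up for illustration):

        import re

        def slugify(text):
            """Lower-case the text and replace any run of non-alphanumerics with a hyphen."""
            return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

        def tutorial_url(categories, title, tutorial_id=None):
            """Build a keyword-rich URL; optionally prefix one numeric id for lookups."""
            parts = [slugify(c) for c in categories] + [slugify(title)]
            if tutorial_id is not None:
                parts.insert(0, str(tutorial_id))      # a single id is enough to find the row
            return "http://www.mywebsite.com/" + "/".join(parts)

        print(tutorial_url(["Web Development", "Client Side", "JavaScript"], "What is JavaScript?"))
        # http://www.mywebsite.com/web-development/client-side/javascript/what-is-javascript
        print(tutorial_url(["Web Development", "Client Side", "JavaScript"], "What is JavaScript?", 100))
        # http://www.mywebsite.com/100/web-development/client-side/javascript/what-is-javascript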

    Read the article

  • What is a generic term for name/identifier? (as opposed to label)

    - by d3vid
    I need to refer to a number of things that have both an identifier value (used in code and configuration) and a human-readable label. These things include:
    - database columns
    - dropdown items
    - subapplications
    - objects stored in a dictionary
    I want two unambiguous terms. One to refer to the identifier/value/key. One to refer to the label. As you can see, I'm pretty settled on the latter :) For the former, identifier seems best (not everything is strictly a key, and value and name could refer to the label; although identifier usually refers only to a variable name), but I would prefer to follow an established practice if there is one. Is there an established term for this? (Please provide a source.) If not, are there any examples of a choice from a significant source (Java APIs, MSDN, a big FLOSS project)? (I wasn't sure if this should be posted here or to English Language & Usage. I thought this was the more appropriate expert audience. Happy to migrate if not.)

    Read the article

  • Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter

    - by Kellsey Ruppel
    Los Angeles Department of Building & Safety (LADBS) is one of the largest construction permitting departments in the country, serving over 350,000 walk-in and 530,000 phone customers, and issuing over 110,000 permits worth $3 billion every year. LADBS needed a way to migrate walk-in and phone transactions to customer self-service, so they turned to Oracle WebCenter and teamed with Oracle partner 3Di to deliver a customer self-service portal that lowers their cost of customer service operations while increasing customer satisfaction. Attend this Webcast to learn how Oracle WebCenter has allowed the Los Angeles Department of Building & Safety to:
    - Deliver a state-of-the-art customer self-service portal
    - Reduce traffic on high-cost, low-satisfaction customer service channels
    - Integrate business workflows and legacy applications
    Register now for this exclusive event: Wednesday, November 14, 2012, 10 a.m. PT / 1 p.m. ET. Presented by: Giovani Dacumos, Director of Systems, LADBS; Jing Reyes, Applications Development Group Manager, LADBS; Rajiv Desai, CEO, 3Di; Sheetal Paranjpye, Project Manager, 3Di.

    Read the article

  • Site migration and SEO impact

    - by John Smith
    I'd greatly appreciate a response on the following question relating to site migration and SEO impact. Here's some background on how my domain name and site are currently configured.
    My domain name provider has the following settings:
    - host name @ is an A NAME record and points to IP address x.x.x.x
    - host name www is an A NAME record and points to IP address x.x.x.x
    - sub-domain host name new.example.com is an A NAME record and points to IP address x.x.x.x
    My hosting provider has the following settings:
    - host record @ is an A NAME record and points to IP address x.x.x.x, folder home/public_html/old
    - host record www is a C NAME record and points to example.com
    - sub-domain host record new.example.com points to home/public_html/new
    I want to:
    - point the domain (example.com AND www.example.com) to the content hosted under folder home/public_html/new, which is currently the content directory for new.example.com
    - retire the content hosted under folder home/public_html/old
    - retire the sub-domain host record new.example.com
    I believe the easiest method of doing this is removing the sub-domain host record new.example.com, and changing the following line in the .htaccess file in home/public_html from
        # Change 'subdirectory' to be the directory you will use for your main domain.
        RewriteCond %{REQUEST_URI} !^/old/
    to
        # Change 'subdirectory' to be the directory you will use for your main domain.
        RewriteCond %{REQUEST_URI} !^/new/
    But I don't understand how this will impact my SERP - ideally, I'd like it to remain the same. Research on this topic resulted in the following Google page, which was no help, and this related StackExchange question, which suggests that this should not affect my SERP (at least, not permanently). But I wanted to make certain with a more specific example, and hopefully contribute to the community at the same time. I'd appreciate any feedback on this. Is there a better/recommended method to migrate sites this way? Is there an SEO impact?

    Read the article

  • Excel VBA to CRUD Drupal nodes

    - by Kirk Hings
    We need to periodically migrate Excel report data into Drupal nodes. We looked at replicating some Excel functionality in Drupal with slickgrid, but it wasn't up to snuff. The Excel reports people don't want to double-enter their data, but their data is important to have in this Drupal site. They have hundreds of Excel reports, and update a row in each weekly. We want a button at the row end to fire a VBA macro that submits the data to Drupal, where a new node is created from the info submitted. (Yes, we are experienced with both Drupal and VBA; all users and the site are behind our firewall.) We need the new node's nid or URL returned so we can then create a link in Excel directly to that node. The site is D6, using the Services 3.x module. I tried the REST server module, but we can't get it to retrieve data without session authentication on, which we can't do from Excel (unless you can?). I also noticed the 'data' it was returning via a browser URL was 14 or 20 nodes' info, not the one nid requested (example: http://mysite.com/services/rest/report/node/30161). When I attempt to create a simple node like this from VBA:

        Dim MyURL As String
        MyURL = "http://mysite.com/services/rest/report/node?node[type]=test&node[title]=testing123&node[field_test_one][0][value]=123"
        Set objHTTP = CreateObject("MSXML2.ServerXMLHTTP")
        With objHTTP
            .Open "POST", MyURL, False
            .setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
            .send (MyURL)
        End With

    I get HTTP Status: Unauthorized: Access denied for user 0 "anonymous" and HTTP Response: null. Everything I search for has examples in PHP or Java, nothing in VBA. I also tried switching to using an XML-RPC server, but that's even more confusing. We would like JSON (we used application/json and set the formatter accordingly in the REST server settings), but will use anything that works. Ideas? Thanks in advance!
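
    For comparison, the flow the REST server expects is to authenticate first (so later requests carry a session cookie) and to send the node as a request body rather than in the query string. The Python sketch below illustrates that flow; the endpoint path ("report"), content type, field structure and credentials are assumptions copied from the question rather than verified against this site, and newer Services releases may additionally require a CSRF token:

        # Sketch: session-authenticated node creation against a Drupal Services 3.x
        # REST endpoint. Paths, content type and field names are guesses taken from
        # the question; adjust them to the actual Services configuration.
        import requests

        BASE = "http://mysite.com/services/rest/report"

        session = requests.Session()
        session.headers.update({"Accept": "application/json"})

        # 1. Log in so subsequent requests carry an authenticated session cookie.
        login = session.post(BASE + "/user/login",
                             json={"username": "excel-bot", "password": "secret"})
        login.raise_for_status()

        # 2. Create the node; the response should include the new node's nid and URI.
        node = {"type": "test",
                "title": "testing123",
                "field_test_one": [{"value": "123"}]}
        created = session.post(BASE + "/node", json={"node": node})
        created.raise_for_status()
        print(created.json())   # expect something like {"nid": "...", "uri": ".../node/<nid>"}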

    Read the article

  • FY11 plans – how can you increase your SOA business?

    - by Jürgen Kress
    Thanks for a fantastic FY10; it was great to work with all of you! Yes, with the economic crisis the fiscal year was hard. SOA and Oracle Fusion Middleware address these challenges and can help companies save costs by integrating their systems and automating and changing their processes. More when we publish our fiscal year results. What is on the agenda for FY11?
    Specialization: It is key that you become SOA & Application Grid Specialized. We will focus our activities and budgets on partners with Specialization!
    Sales campaigns: To support you in our joint business we will continue to run joint sales campaigns. With OFM 11g there is a great opportunity to generate service revenue to migrate and to consolidate on the platform. It is key that you register your opportunities within the Open Market Model (OMM) to ensure sales alignment.
    Enablement: With the release of many new products and versions, training is key. We will continue to offer training dedicated to your role: sales, pre-sales and implementation. Make sure that you check local partner training calendars and sign up for the next bootcamps.
    Thanks for your support! Jürgen Kress

    Read the article

  • Efficient use of Bundling

    - by ACShorten
    One of the discussions I am having with customers and consulting people is about Bundling and its appropriate use. We introduced Bundling post-release in the V2.2 code line to allow partners and consultants to build solutions using the Configuration Tools objects such as UI Maps, Service Scripts, Business Objects, Business Services etc., and then export and migrate them as solutions. Whilst that was the original intent, I have found a few teams using the facility for other data and then complaining about the efficiency or relevance of the tool. Here are a number of guidelines to help optimize the use of Bundling for your implementation:
    - Not all objects can be bundled. Only specific objects in the product can be bundled. These are targeted at Configuration Tools objects and a select group of other objects that are required for them: Maintenance Objects with the option "Eligible for Bundling" set to Y (which also contain a Bundling Add BO).
    - Add objects to the Bundle as you complete them. Bundling can have issues with sequencing objects. The best way of combating this is to add objects to the bundle as you complete them. This will help make sure you sequence the loading of the objects, as you are building them, in the correct order. Remember Bundling was designed for developers and partners to deliver solutions. If you leave adding objects to a Bundle to the Bundle Export zones, you will have less control over the sequence in which they are applied, and this can cause timing issues.
    - Bundling takes the latest revision. If you combine Bundling with Revision Control, Bundling will take the latest revision of the object at the time of the export operation.
    - Bundling and version control products. If you use a version control tool to control your Java code, you can also check in the Bundle to associate a release between the code and a bundle.
    Bundling is quite a powerful feature of the Oracle Utilities Application Framework that allows sales, partners, consultants and customers to package and import their Configuration Tools based solutions.

    Read the article

  • A few unpleasant facts about Visual Studio 2012.

    - by Ilya Verbitskiy
    I have been playing with Visual Studio 2012 for the last couple of days. The new IDE is pretty good but, unfortunately, I found a few unpleasant facts. First of all, the new release comes without Visual Studio setup projects. I am disappointed, because I am using them for my pet project – Easy Shutdown. The tool is a small widget-like application which allows you to reboot, log out or shut down your PC. I have not made any decision yet, but I would probably migrate to WiX. The second surprise is that Microsoft will not include Visual Studio macros in the new release. Since I am a lazy guy, I like small hacks using macros. For example, I have macros to refresh all projects or attach to IIS. The only way to solve the problem is to convert your macros to a Visual Studio plugin. I have not tried it yet, but I will definitely do so in the nearest future. The third fact I do not like is Visual Studio's default themes. Maybe somebody likes them, but they are hard to get used to after Visual Studio 2010. Fortunately there is a solution: Matthew Johnson released the Visual Studio 2012 Color Theme Editor. It comes with a few predefined themes. I really like the Blue one.

    Read the article

  • New partnership allows auto-transposition of client/server application to Windows Azure

    - by Webgui
    The economics of IT is changing rapidly, and organizations are seeking to widen and secure the availability of their systems while at the same time lowering costs, which is exactly what the cloud is meant to do. Running your systems on Microsoft's Windows Azure cloud, for example, would improve and secure the availability, accessibility and scalability (both up and down) of your systems and support the new IT economics. However, in order to take advantage of the cloud's promise of lower cost of ownership, applications must be built or adjusted to work on that platform, and in most cases this is not a simple task. Even existing web applications cannot always be transferred to Azure without some changes, and for client/server applications the task is far more challenging, even to the point where it seems impossible. The reason is the gap between client/server desktop technology and the cloud's. For that reason, most of the known methodologies to migrate existing client/server applications actually involve a rewrite of the desktop systems for the cloud. A unique approach is introduced by Visual WebGui, which creates a virtualization layer atop the ASP.NET web server, moves the transformed or generated .NET code to that layer, and then, using a patent-pending protocol, renders a user interface within a plain browser. The end result is pure .NET code that is the base code for a pure rich web application, and now, due to a collaboration with Microsoft Windows Azure, Visual WebGui provides the shortest path from client/server to the Azure cloud by being able to handle close to 95% of the transformation to the cloud platform in an automatic way. Application migration to Azure without migraines. More information about the Instant CloudMove Azure solution here.

    Read the article

  • Migrating Forms to Java or ADF, the truth and no FUD

    - by Grant Ronald
    The question about migrating Forms to Java (or ADF or APEX) comes up time and time again. I wanted to pull some core information together in a single blog post to address this question. The first question I always ask is "WHY" - Forms may still be a viable option for you, so "if it ain't broke don't fix it". The bottom line is, whatever anyone tells you, it's going to be a considerable effort and cost to migrate from Forms to something else, so the business is going to want to know WHY you spent all those hard-earned dollars switching from something that might have been serving you quite adequately. Second point: if you are going to switch, I would encourage you NOT to look at building a Forms clone. So many times I see people trying to build an ADF application and EXACTLY mimic the Forms model - ADF is NOT a Forms clone. You should be building to the sweet spot of your target technology, not your 20-year-old client/server technology. This is also the chance for the business to embrace change, so maybe look at new processes, channels and technology options that weren't available when you first developed your Forms applications. To help you understand what is involved, I've put together a number of resources:
    - Thinking about migration of Forms to Java, ADF or APEX? Read this to prepare yourself.
    - Oracle Forms to ADF: When, Why and How - this gives you an overview of our vision, directly from Oracle Product Management.
    - Redeveloping a Forms Application with Oracle JDeveloper and Oracle ADF. This is a conference session from myself and Lynn Munsinger on how ADF can be used in a Forms migration/rewrite.
    As someone who manages both the Forms and ADF Product Management teams, I've a foot in either camp and am happy to see you use either tool. However, I want you to be able to make an informed decision. My hope is that these information sources will help you do that.

    Read the article

  • Sync KeePassX with KeePass2

    - by bioShark
    Simply put: in Ubuntu I am using KeePassX and in Windows KeePass2. I am not able to export/import passwords from one to the other. I would prefer to use the same database, but I don't really know how. If there is no way to sync the two, can you recommend another password vault which can sync passwords between the two OSes using a shared database? Thanks. I am using Ubuntu 12.04 and Win 7. Edit: I have noticed that KeePass2 is available in the Software Center, so I have installed it, and I can successfully open my Win7 database. Now I will migrate my KeePassX passwords. I am seeing a huge difference in the looks, though. While KeePassX doesn't exactly have an Ubuntu-like look & feel, it's 100 times more elegant than the interface KeePass2 comes with. Well, maybe that was my initial reason for installing KeePassX on my Ubuntu machine; I can't remember. @fossfreedom, please add your comment as a response, so that I can accept it. Thanks for the suggestion.

    Read the article

  • Updating an interface to bootstrap

    - by Anagio
    I'm updating a web app's interface to Bootstrap. There's a lot of existing CSS and JavaScript/jQuery I'll have to migrate; most of it I'll scrap and use Bootstrap's, but for the parts of the app that use DataTables and such, all that code has to be migrated. I'm working on a development server. The app has a header.phtml, a sidebar.phtml and a lot of content-area view files. Right now I'm building static versions of view files, say the header. I then open my existing header.phtml in a Notepad++ split screen with the static file, copy over the dynamic code, and then replace the old header.phtml with the one I just made. To make sure the header displayed correctly I had to add all the CSS and JS from Bootstrap. This is conflicting with the current CSS styles, and there are some JS conflicts as well. Should I go through the app, note which JS I absolutely need and which I don't, do the same with the CSS, and then strip all the unneeded CSS/JS from the old app so it only has Bootstrap's and any other critical files, without worrying about the way pages look as I make progress updating them? I'd be working mostly on a wireframe of the old site without any styles until I get to applying Bootstrap's. Is this efficient, or is there another way I can get through all these files and update them easily?

    Read the article

  • You Don't Want to Meet Orgad Kimchi in a Dark Alley

    - by rickramsey
    Do you remember what those bad guys in the old Charles Bronson films looked like? They looked like Orgad Kimchi, that's what they looked like. When I met him at Oracle OpenWorld 2012, I realized I didn't want to meet him in the wrong alleyway of Budapest after dark. Neither do old versions of Oracle Solaris, which Orgad bends to his will with as much ease as he probably bends stray tourists to his will in Budapest, Kandahar, or Dagestan. How Orgad Made Oracle Database Migrate from Oracle Solaris 8 to Oracle Solaris 11: In this article, which we liked so much we reprinted it from his blog (please don't tell him!), Orgad explains how he head-butted an Oracle Database into submission. The database thought it was safe running on Oracle Solaris 8, but Orgad dragged its whimpering carcass into Oracle Solaris 11. How'd he do that? Well, if you had met Orgad in person, you wouldn't ask that question. Because you'd know he could have simply stared at it, and the database would have migrated on its own. But Orgad didn't do that. Instead, he stuffed an Oracle Solaris 8 Physical-to-Virtual (P2V) Archiver Tool into his leather trench coat, the one with the special pockets sewn in by the East German Secret Police for several Uzis and their ammo, and walked into his data center in a way that reminded the survivors of this clip from Matrix Reloaded. The end result? The Oracle Database 10.2 that was running on Oracle Solaris 8 is now running inside a Solaris 10 branded zone on Oracle Solaris 11. With no complaints. Don't make Orgad angry. Read his article. - Rick

    Read the article

  • SQL – What is the latest Version of NuoDB? – A Quick Contest to Get Amazon Gift Cards

    - by Pinal Dave
    We had a great contest earlier last week - What ACID Stands for in the Database? – Contest to Win 24 Amazon Gift Cards and Joes 2 Pros 2012 Kit. It received quite a few responses. Just like any other contest, not everyone was a winner. The kind folks at NuoDB decided to give another chance to everyone who did not win in the last contest. This means if you missed taking part in the earlier contest, or if you took part and did not win, you still have one more chance to win an Amazon Gift Card. Here is the quick contest: you just have to go and download NuoDB. The first 10 people who download NuoDB will get USD 10 cards. Everyone else will be entered into a lucky draw for USD 50 Amazon Gift Cards. Winners will be announced in the next 24 hours. Bonus round: if you have entered the contest above, you can also enter to win the latest Beginning SSRS Joes 2 Pros book. You just have to leave a comment over here describing your experience with NuoDB and what the latest version of the product is. Here are a few of the blog posts I wrote earlier on that subject:
    - Part 1 – Install NuoDB in 90 Seconds
    - Part 2 – Manage NuoDB Installation
    - Part 3 – Explore NuoDB Database
    - Part 4 – Migrate from SQL Server to NuoDB
    - Part 5 - NuoDB and Third Party Explorer – SQuirreL SQL Client, SQL Workbench/J and DbVisualizer
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Archbeat Link-O-Rama Top 10 Tweets for October 2013

    - by OTN ArchBeat
    What caught the attention of the 1,988 people who follow @OTNArchBeat last month? The answer is below, in the list of Top 10 Tweets for October 2013 RT @java: Which women in tech inspire you? Blog about them on Ada Lovelace Day! #ALD13 Oct 10, 2013 at 11:14 AM RT @ORCL_Linux: New blog post: Announcing Unbreakable Enterprise Kernel Release 3 for #Oracle #Linux Oct 21, 2013 at 07:11 PM RT @glassfish: Quick & Dirty How-to Guide: Install #GlassFish 4 on #RaspberryPi. Creating an #IoT infra via @MkHeck Oct 27, 2013 at 07:19 PM RT @java: Nighthacking with James Gosling, interview from Hawaii, watch live Oct. 23, 11am PT #java Oct 21, 2013 at 11:26 AM RT @ensode: "Oracle has posted blogs on how to migrate from #Spring to #JavaEE" I wrote the linked article Oct 07, 2013 at 10:53 AM SOA and User Interfaces - by @soacommunity @hajonormann @gschmutz @t_winterberg et al #industrialsoa Oct 03, 2013 at 01:17 PM RT @oracleace: Welcome and congrats to new #ACEDs @kevin_mcginley and Rene van Wijk @MiddlewareMagic Oct 25, 2013 at 12:59 PM SOA in Real Life: Mobile Solutions by @soacommunity @HajoNormann @gschmutz @t_winterberg et al #industrialsoa Oct 28, 2013 at 09:23 AM RT @OracleAnalytics: Curious to how big #oow13 was? Here’s an infographic to show you some of the stats. Oct 25, 2013 at 01:13 PM Free Poster: ACM in Practice >> thanks to @dschmeid @hajonormann @torsten_winterberg @tbmaier @gschmutz et al. Oct 16, 2013 at 09:56 AM Thought for the Day "You can converge a toaster and a refrigerator, but those things are probably not going to be pleasing to the user." — Tim Cook, CEO of Apple Inc. (Born November 1, 1960) Source: brainyquote.com

    Read the article

  • Finally! Entity Framework working in fully disconnected N-tier web app

    - by oazabir
    Entity Framework was supposed to solve the problems of Linq to SQL, which requires endless hacks to make it work in an n-tier world. Not only did Entity Framework solve none of the L2S problems, but it also made it even more difficult to use and hack for n-tier scenarios. It's somewhere halfway between a fully disconnected ORM and a fully connected ORM like Linq to SQL. Some useful features of Linq to SQL are gone – like automatic deferred loading. If you try to do a simple select with join, insert, update or delete in a disconnected architecture, you will realize that not only do you need to make fundamental changes from the top layer to the very bottom layer, you also need endless hacks in basic CRUD operations. I will show you in this article how I have added custom CRUD functions on top of EF's ObjectContext to make it finally work well in a fully disconnected n-tier web application (my open source Web 2.0 AJAX portal – Dropthings), and how I have produced a 100% unit-testable, fully n-tier compliant data access layer following the repository pattern. http://www.codeproject.com/KB/linq/ef.aspx In .NET 4.0, most of the problems are solved, but not all. So, you should read this article even if you are coding in .NET 4.0. Moreover, there's enough insight here to help you troubleshoot EF-related problems. You might think, "Why bother using EF when Linq to SQL is doing well enough for me?" Linq to SQL is not going to get any more innovation from Microsoft. Entity Framework is the future of the persistence layer in the .NET framework. All the innovation is happening in the EF world only, which is frustrating. There's a big jump in EF 4.0. So, you should plan to migrate your L2S projects to EF soon.

    Read the article

  • Oracle VM server for SPARC 2.2 on S11

    - by Liam Merwick
    Oracle VM Server for SPARC 2.2 has been released for a little while now. The https://blogs.oracle.com/virtualization blog has an overview of all the 2.2 features. Initially, what was released was the SVR4 package for Solaris 10 (which is unbundled and wasn't constrained by any external schedule). On Solaris 11, the 'ldomsmanager' package is built into Solaris (and therefore doesn't need to be downloaded separately), so it is delivered as part of an S11 Support Repository Update (SRU). Some of the features in 2.2 are specific to S11 (SR-IOV and the ability to live migrate between machines with different CPU types), so there have been many requests to know when the S11 bits are coming. Solaris 11 SRU8.5 was released on Friday and includes Oracle VM Server for SPARC 2.2, so if you're already running an S11 SRU all you need do is a 'pkg update' to get the 2.2 bits. If you're still running the original S11 and your 'pkg publisher' output shows the /release repository, you'll need to sign up for the /support repo by getting the appropriate keys and certificates to access the repository (this requires a support contract). The 2.2 Admin Guide documents how to do this upgrade on S11. Two S11 articles which have some useful details on upgrading (not just 'ldomsmanager') via the support repositories are:
    - How to Update Oracle Solaris 11 Systems From Oracle Support Repositories by Glynn Foster
    - Tips for Updating Your Oracle Solaris 11 System from the Oracle Support Repository by Peter Dennis
    In particular, if you'd like to stick with the v2.1 release when upgrading to SRU8.5 or greater, see the 'pkg freeze' section of Peter's article.

    Read the article

  • Meet the New Windows Azure

    - by ScottGu
    Today we are releasing a major set of improvements to Windows Azure. Below is a short summary of just a few of them:

    New Admin Portal and Command Line Tools

    Today’s release comes with a new Windows Azure portal that will enable you to manage all features and services offered on Windows Azure in a seamless, integrated way. It is very fast and fluid, supports filtering and sorting (making it much easier to use for large deployments), works on all browsers, and offers a lot of great new features – including built-in VM, Web site, Storage, and Cloud Service monitoring support. The new portal is built on top of a REST-based management API within Windows Azure – and everything you can do through the portal can also be programmed directly against this Web API. We are also today releasing command-line tools (which, like the portal, call the REST management APIs) to make it even easier to script and automate your administration tasks. We are offering both a PowerShell (for Windows) and Bash (for Mac and Linux) set of tools to download. Like our SDKs, the code for these tools is hosted on GitHub under an Apache 2 license.

    Virtual Machines

    Windows Azure now supports the ability to deploy and run durable VMs in the cloud. You can easily create these VMs using a new Image Gallery built into the new Windows Azure Portal, or alternatively upload and run your own custom-built VHD images. Virtual Machines are durable (meaning anything you install within them persists across reboots) and you can use any OS with them. Our built-in image gallery includes both Windows Server images (including the new Windows Server 2012 RC) as well as Linux images (including Ubuntu, CentOS, and SUSE distributions). Once you create a VM instance you can easily Terminal Server or SSH into it in order to configure and customize the VM however you want (and optionally capture your own image snapshot of it to use when creating new VM instances). This provides you with the flexibility to run pretty much any workload within Windows Azure. The new Windows Azure Portal provides a rich set of management features for Virtual Machines – including the ability to monitor and track resource utilization within them. Our new Virtual Machine support also enables the ability to easily attach multiple data disks to VMs (which you can then mount and format as drives). You can optionally enable geo-replication support on these – which will cause Windows Azure to continuously replicate your storage to a secondary data center at least 400 miles away from your primary data center as a backup. We use the same VHD format that is supported with Windows virtualization today (and which we’ve released as an open spec), which enables you to easily migrate existing workloads you might already have virtualized into Windows Azure. We also make it easy to download VHDs from Windows Azure, which provides the flexibility to easily migrate cloud-based VM workloads to an on-premise environment. All you need to do is download the VHD file and boot it up locally, no import/export steps required.

    Web Sites

    Windows Azure now supports the ability to quickly and easily deploy ASP.NET, Node.js and PHP web sites to a highly scalable cloud environment that allows you to start small (and for free) and then scale up as your traffic grows.
    You can create a new web site in Azure and have it ready to deploy to in under 10 seconds. The new Windows Azure Portal provides built-in administration support for web sites – including the ability to monitor and track resource utilization in real time. You can deploy to web sites in seconds using FTP, Git, TFS and Web Deploy. We are also releasing tooling updates today for both Visual Studio and WebMatrix that enable developers to seamlessly deploy ASP.NET applications to this new offering. The VS and WebMatrix publishing support includes the ability to deploy SQL databases as part of web site deployment – as well as the ability to incrementally update the database schema with a later deployment. You can integrate web application publishing with source control by selecting the “Set up TFS publishing” or “Set up Git publishing” links on a web site’s dashboard. Doing so will enable integration with our new TFS online service (which enables a full TFS workflow – including elastic build and testing support), or create a Git repository that you can reference as a remote and push deployments to. Once you push a deployment using TFS or Git, the deployments tab will keep track of the deployments you make, and enable you to select an older (or newer) deployment and quickly redeploy your site to that snapshot of the code. This provides a very powerful DevOps workflow experience. Windows Azure now allows you to deploy up to 10 web sites into a free, shared/multi-tenant hosting environment (where a site you deploy will be one of multiple sites running on a shared set of server resources). This provides an easy way to get started on projects at no cost. You can then optionally upgrade your sites to run in a “reserved mode” that isolates them so that you are the only customer within a virtual machine. And you can elastically scale the amount of resources your sites use – allowing you to increase your reserved instance capacity as your traffic scales. Windows Azure automatically handles load balancing traffic across VM instances, and you get the same, super fast deployment options (FTP, Git, TFS and Web Deploy) regardless of how many reserved instances you use. With Windows Azure you pay for compute capacity on a per-hour basis – which allows you to scale your resources up and down to match only what you need.

    Cloud Services and Distributed Caching

    Windows Azure also supports the ability to build cloud services that support rich multi-tier architectures, automated application management, and scale to extremely large deployments. Previously we referred to this capability as “hosted services” – with this week’s release we are now referring to this capability as “cloud services”. We are also enabling a bunch of new features with them.

    Distributed Cache

    One of the really cool new features being enabled with cloud services is a new distributed cache capability that enables you to set up and use a low-latency, in-memory distributed cache within your applications. This cache is isolated for use just by your applications, and does not have any throttling limits. This cache can dynamically grow and shrink elastically (without you having to redeploy your app or make code changes), and supports the full richness of the AppFabric Cache Server API (including regions, high availability, notifications, local cache and more).
    In addition to supporting the AppFabric Cache Server API, it also now supports the Memcached protocol – allowing you to point code written against Memcached at it (no code changes required). The new distributed cache can be set up to run in one of two ways:

    1) Using a co-located approach. In this option you allocate a percentage of memory in your existing web and worker roles to be used by the cache, and the cache joins that memory into one large distributed cache. Any data put into the cache by one role instance can be accessed by other role instances in your application – regardless of whether the cached data is stored on it or another role. The big benefit of the “co-located” option is that it is free (you don’t have to pay anything to enable it) and it allows you to use what might otherwise have been unused memory within your application VMs.

    2) Alternatively, you can add “cache worker roles” to your cloud service that are used solely for caching. These will also be joined into one large distributed cache ring that other roles within your application can access. You can use these roles to cache 10s or 100s of GBs of data in-memory very effectively – and the cache can be elastically increased or decreased at runtime within your application.

    New SDKs and Tooling Support

    We have updated all of the Windows Azure SDKs with today’s release to include new features and capabilities. Our SDKs are now available for multiple languages, and all of the source in them is published under an Apache 2 license and maintained in GitHub repositories. The .NET SDK for Azure has in particular seen a bunch of great improvements with today’s release, and now includes tooling support for both VS 2010 and the VS 2012 RC. We are also now shipping Windows, Mac and Linux SDK downloads for languages that are offered on all of these systems – allowing developers to develop Windows Azure applications using any development operating system.

    Much, Much More

    The above is just a short list of some of the improvements that are shipping in either preview or final form today – there is a LOT more in today’s release. These include new Virtual Private Networking capabilities, new Service Bus runtime and tooling support, the public preview of the new Azure Media Services, new data centers, significantly upgraded network and storage hardware, SQL Reporting Services, new Identity features, support within 40+ new countries and territories, and much, much more. You can learn more about Windows Azure and sign up to try it for free at http://windowsazure.com. You can also watch a live keynote I’m giving at 1pm June 7th (later today) where I’ll walk through all of the new features. We will be opening up the new features I discussed above for public usage a few hours after the keynote concludes. We are really excited to see the great applications you build with them. Hope this helps, Scott
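
    As a side note on the Memcached support mentioned above: because the cache also speaks the Memcached protocol, existing Memcached client code can simply be pointed at the cache endpoint. A small illustration using the python-memcached library (the host name and port here are placeholders; the real endpoint comes from your cloud service's cache configuration):

        # Sketch: talking to the new distributed cache over its Memcached-compatible endpoint.
        # "mycloudservice-cache:11211" is a placeholder for the endpoint your roles expose.
        import memcache

        client = memcache.Client(["mycloudservice-cache:11211"])

        client.set("user:42:profile", {"name": "Ada", "plan": "reserved"}, time=300)  # 5-minute TTL
        print(client.get("user:42:profile"))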

    Read the article

  • Domain transfer and New Hosting Management

    - by Anubhav Saini
    I wanted to migrate from my old registrar to GoDaddy, mainly because the current registrar/hosting provider doesn't support .NET. My old registrar gave me control over the domain and hosting account, so basically I have everything I would need (in theory, at least). I applied for a transfer of the domain, bought a hosting package from GoDaddy and uploaded the new web site. So I am waiting for the domain transfer, and it tells me that I have to wait 5-7 days for approval. Okay. But today my old registrar told/taunted me that I really didn't need to apply for a transfer. What could I possibly have done differently? My domain expires on the 15th. Now I don't know much about how all of this really works, but I am guessing he meant "you should have waited 15 days and let it expire, after which you could buy the domain as it is expired". Is it really so (I doubt it), or are there other ways I could have got the same result without transferring the domain (like changing DNS entries)? I have read just about all of the documentation available on Namecheap/GoDaddy/Whois about domain transfers, but maybe because I am new to this it is all confusing to me. I would also like to know what to do with the DNS settings after the transfer succeeds. I want to kill the old website, so which nameserver settings do I need to change: the new ones, the old ones, or both? On one hand I have the old host, the old domain registrar and the old working site; on the other hand, the new site, the pending domain transfer and the new DNS settings.

    Read the article

  • ArchBeat Link-o-Rama Top 10 for November 2, 2012

    - by Bob Rhubart
    ADF Mobile - Login Functionality | Andrejus Baranovskis "The new ADF Mobile approach with native deployment is cool when you want to access phone functionality (camera, email, sms and etc.), also when you want to build mobile applications with advanced UI, " reports Oracle ACE Director Andrejus Baranovskis. Big Data: Running out of Metric System | Andrew McAfee Do very large numbers make your brain hurt? Better stock up on aspirin. According to Andrew McAfee: "It seems safe to say that before the current decade is out we’ll need to convene a 20th conference to come up with some more prefixes for extraordinarily large quantities not to describe intergalactic distances or the amount of energy released by nuclear reactions, but to capture the amount of digital data in the world." Cloud computing will save us from the zombie apocalypse | Cloud Computing - InfoWorld "It's just a matter of time before we migrate our existing IT assets to public cloud systems," says InfoWorld cloud blogger David Linthicum. "Additionally, it's a short window until the dead rise from the grave and attempt to eat our brains." Is it Halloween or something? Thought for the Day "A computer lets you make more mistakes faster than any invention in human history—with the possible exceptions of hand guns and tequila." — Mitch Ratcliffe

    Read the article
