Search Results

Search found 284 results on 12 pages for 'trevor robinson'.

Page 11/12 | < Previous Page | 7 8 9 10 11 12  | Next Page >

  • Adding a web address to a <a href="value">

    - by Michael Robinson
    I have an upload script that writes an index.html file; I modified it to mail me the results. However, the links point to the root, and since the email isn't served from the root, the images don't load. How do I change the code to prepend "http://www.home.html/uploaded" to the ". $value ." so that the images show up in the email? Here is the portion of the PHP that assigns the images to $result:

        // Process value of Imagedata field, this is JPEG-files array
        foreach ($_FILES[Imagedata][name] as $key => $value) {
            $uploadfile = $destination_dir . basename($_FILES[Imagedata][name][$key]);
            if (move_uploaded_file($_FILES['Imagedata']['tmp_name'][$key], $uploadfile)) {
                $result .= "File uploaded: <a href='" . $value . "'>" . $value . "</a><br>";
            }
        }
        // $result .= "<br>";

    Here is what I'm now receiving in the email:

        <!doctype html public "-//w3c//dtd html 4.0 transitional//en">
        <html>
        <head>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
        <title>Upload results</title>
        </head>
        <body>
        AdditionalStringVariable: pass additional data here<br><a> href='gallery_grannylovesthis_l.jpg'><img border = '0' src='QIU_thumb_gallery_grannylovesthis_l.jpg'/></a><br><br><br>File uploaded: <a href='gallery_grannylovesthis_l.jpg'>gallery_grannylovesthis_l.jpg</a><br><br> GlobalControlData: PHOTO_VISIBILITY : 2<br> GlobalControlData: PHOTO_DESCR : Requiredtest<br> GlobalControlData: PHOTO_TITLE : Requiredtest<br><br>gallery_grannylovesthis_l.png<br> control: , value: <br>
        </body>
        </html>

    Thanks in advance for any guidance... I have a feeling it's something simple.
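
    A minimal sketch of one way to do this, assuming the upload directory is reachable at a public base URL (the $base_url value below is hypothetical - substitute the real URL of your uploaded folder):

        <?php
        $base_url = 'http://www.example.com/uploaded/'; // hypothetical public URL of $destination_dir

        foreach ($_FILES['Imagedata']['name'] as $key => $value) {
            $uploadfile = $destination_dir . basename($_FILES['Imagedata']['name'][$key]);
            if (move_uploaded_file($_FILES['Imagedata']['tmp_name'][$key], $uploadfile)) {
                // Build an absolute link so the image resolves inside the email.
                $url = $base_url . rawurlencode(basename($value));
                $result .= "File uploaded: <a href='" . $url . "'>" . htmlspecialchars($value) . "</a><br>";
            }
        }
        ?>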

    Read the article

  • Loading a UIView from a UITableview

    - by Michael Robinson
    I can't figure out how to push a UIView from a UITableView and have the "child" details appear. Here is the view I'm trying to load. Here is the code that checks for children and either pushes an ItemDetail.xib or an additional UITable; I want to use the above .xib but load the correct contents ("tableDataSource") into the UITable:

        - (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath {
            // Get the dictionary of the selected data source.
            NSDictionary *dictionary = [self.tableDataSource objectAtIndex:indexPath.row];
            // Get the children of the present item.
            NSArray *Children = [dictionary objectForKey:@"Children"];

            if ([Children count] == 0) {
                ItemDetailViewController *dvController = [[ItemDetailViewController alloc]
                    initWithNibName:@"ItemDetailView" bundle:[NSBundle mainBundle]];
                [self.navigationController pushViewController:dvController animated:YES];
                [dvController release];
            } else {
                // Prepare the table view.
                FirstTab *rvController = [[FirstTab alloc]
                    initWithNibName:@"FirstView" bundle:[NSBundle mainBundle]];
                // Increment the current level.
                rvController.CurrentLevel += 1;
                // Set the title.
                rvController.CurrentTitle = [dictionary objectForKey:@"Title"];
                // Push the new table view on the stack.
                [self.navigationController pushViewController:rvController animated:YES];
                rvController.tableDataSource = Children;
                [rvController release];
            }
        }

    Thanks for the help. I see lots of stuff on this but can't find the correct push instructions.
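
    One pattern that may help (a sketch; itemDictionary is a hypothetical property you would declare on ItemDetailViewController): hand the detail controller its data before pushing it, mirroring what the else-branch already does for rvController.tableDataSource:

        ItemDetailViewController *dvController = [[ItemDetailViewController alloc]
            initWithNibName:@"ItemDetailView" bundle:[NSBundle mainBundle]];
        // Pass the selected row's data in before the push (hypothetical property).
        dvController.itemDictionary = dictionary;
        dvController.title = [dictionary objectForKey:@"Title"];
        [self.navigationController pushViewController:dvController animated:YES];
        [dvController release];

    The detail view's viewDidLoad can then populate its outlets from itemDictionary.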

    Read the article

  • MediaWiki: workaround for watchlist restricted to 7 days?

    - by Mark Robinson
    I'm running MediaWiki 1.13.2. When I go to my watchlist, I'm limited to viewing only the last 7 days (though MediaWiki should permit me to view the last 30 days). I've tried:

    - Clicking the "all" button
    - Changing the URL
    - Changing the settings under MY PREFERENCES > WATCHLIST. When I save (e.g. 30 days), it resets back to 7 days!
    - Googling for an answer...

    Has anyone heard of this bug before? Is there a workaround? I can't see a setting anywhere for this maximum.
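
    If the wiki's configuration is at fault rather than the preferences form, one place to look (an assumption, not a confirmed diagnosis) is $wgRCMaxAge in LocalSettings.php: the watchlist is built from the recentchanges table, so entries older than that cutoff simply aren't available, whatever the preference says. A sketch:

        <?php
        // LocalSettings.php sketch: keep recent changes (and therefore the
        // watchlist window) for 30 days. The value is in seconds.
        $wgRCMaxAge = 30 * 24 * 3600;

    Note that rows already purged from recentchanges won't reappear retroactively; the window only grows forward from the change.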

    Read the article

  • How can I visually format JSON data (programmatically)?

    - by Ian Robinson
    I'm working with big blobs of JSON. These blobs change slightly over time and a revision history is kept. I'd really like to be able to do a visual diff on them, but my problem is they're being stored without any formatting at all - everything is on one line, which makes it a little hard to see what changed. Is there a good way to programmatically format them à la http://jsonformat.com/ or http://jsonformatter.curiousconcept.com/?
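
    If a scriptable formatter is acceptable, most languages can do this with their standard JSON library; a minimal Python sketch (the stable key ordering is what makes line-based diffs meaningful):

        import json

        # Re-indent a compact JSON blob so line-oriented diff tools can compare
        # revisions. sort_keys keeps output stable when keys are merely reordered.
        def pretty(blob: str) -> str:
            return json.dumps(json.loads(blob), indent=2, sort_keys=True)

        print(pretty('{"b": 1, "a": {"c": [1, 2]}}'))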

    Read the article

  • Memory increases with Java UDP Server

    - by Trevor
    I have a simple UDP server that creates a new thread for processing incoming data. While testing it by sending about 100 packets/second, I notice that its memory usage continues to increase. Is there any leak evident from my code below? Here is the code for the server:

        public class UDPServer {
            public static void main(String[] args) {
                UDPServer server = new UDPServer(15001);
                server.start();
            }

            private int port;

            public UDPServer(int port) {
                this.port = port;
            }

            public void start() {
                try {
                    DatagramSocket ss = new DatagramSocket(this.port);
                    while (true) {
                        byte[] data = new byte[1412];
                        DatagramPacket receivePacket = new DatagramPacket(data, data.length);
                        ss.receive(receivePacket);
                        new DataHandler(receivePacket.getData()).start();
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }

    Here is the code for the new thread that processes the data. For now, the run() method doesn't do anything:

        public class DataHandler extends Thread {
            private byte[] data;

            public DataHandler(byte[] data) {
                this.data = data;
            }

            @Override
            public void run() {
                System.out.println("run");
            }
        }
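
    Nothing here leaks outright, but each datagram allocates a fresh 1412-byte buffer plus a whole Thread object, so at 100 packets/second the heap climbs steadily until garbage collection catches up. A sketch of a lighter-weight variant (pool size is illustrative) that reuses one receive buffer and hands workers only the bytes actually received:

        import java.io.IOException;
        import java.net.DatagramPacket;
        import java.net.DatagramSocket;
        import java.util.Arrays;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        public class PooledUDPServer {
            public static void main(String[] args) throws IOException {
                ExecutorService pool = Executors.newFixedThreadPool(4); // illustrative size
                try (DatagramSocket ss = new DatagramSocket(15001)) {
                    byte[] buf = new byte[1412]; // reused across iterations
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    while (true) {
                        ss.receive(packet);
                        // Copy only the received bytes; the shared buffer is reused.
                        byte[] data = Arrays.copyOf(packet.getData(), packet.getLength());
                        pool.execute(() -> process(data));
                        packet.setLength(buf.length); // reset for the next receive
                    }
                }
            }

            private static void process(byte[] data) {
                System.out.println("received " + data.length + " bytes");
            }
        }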

    Read the article

  • Rails modeling for a user

    - by Trevor Hartman
    When building a Rails app that allows a User to log in and create data, is it best to set up a belongs_to :user association on every single model? For example, let's say a user can create Favorites, Colors and Tags. And let's say Favorites has_many :tags and Colors also has_many :tags. Is it still important for Tags to belong_to :user, assuming the User is the only person who has authority to edit those tags? And a similar question along the same lines: when updating data in FavoritesController, I've come to the conclusion that you perform CRUD operations by always doing something like user.favorites.find(params[:id]).update_attributes(params[:favorite]) so that users can definitely only update models that belong to them. Right?
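
    That scoping instinct is the usual Rails idiom; a sketch (current_user stands in for however your authentication layer exposes the logged-in user):

        # Looking the record up through the association guarantees a user can
        # only reach rows they own; a foreign id just raises RecordNotFound.
        class FavoritesController < ApplicationController
          def update
            favorite = current_user.favorites.find(params[:id])
            if favorite.update_attributes(params[:favorite])
              redirect_to favorite
            else
              render :edit
            end
          end
        end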

    Read the article

  • Does setting HttpCacheability.Public also cache the page on the server?

    - by Stewart Robinson
    I have these lines in my global.asax (basically because of http://stackoverflow.com/questions/2469348/can-i-add-my-caching-lines-to-global-asax). What I now want to understand is whether this code purely adds the HTTP headers to the page, or whether it also makes .NET cache this page on the server for 300 seconds:

        Response.Cache.SetExpires(DateTime.Now.AddSeconds(300));
        Response.Cache.SetCacheability(HttpCacheability.Public);
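
    As I understand it (worth verifying against your own traces), HttpCacheability.Public makes the response a candidate for ASP.NET's server-side output cache as well as setting Cache-Control: public for browsers and proxies. If the goal is headers only, HttpCachePolicy exposes an explicit opt-out; a sketch:

        // Keep the public caching headers but suppress the server-side copy.
        Response.Cache.SetExpires(DateTime.Now.AddSeconds(300));
        Response.Cache.SetCacheability(HttpCacheability.Public);
        Response.Cache.SetNoServerCaching(); // headers still sent; server cache skipped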

    Read the article

  • php/mySQL error: mysql_num_rows(): supplied argument is not a valid MySQL result

    - by Michael Robinson
    I'm trying to INSERT INTO a MySQL database and I'm getting this error on: if (mysql_num_rows($login) == 1) {. Here is the PHP; it does add the user to the database. I can't figure it out.

        <?
        session_start();
        require("config.php");

        $u   = $_GET['username'];
        $pw  = $_GET['password'];
        $pwh = $_GET['passwordhint'];
        $em  = $_GET['email'];
        $zc  = $_GET['zipcode'];

        $check = "INSERT INTO family (loginName, email, password, passwordhint, location)
                  VALUES ('$u', '$pw', '$pwh', '$em', '$zc')";
        $login = mysql_query($check, $link) or die(mysql_error());

        if (mysql_num_rows($login) == 1) {
            $row = mysql_fetch_assoc($login);
            echo 'Yes'; exit;
        } else {
            echo 'No'; exit;
        }
        mysql_close($link);
        ?>

    Thanks,
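
    The immediate problem is that mysql_query() returns a boolean for an INSERT, not a result set, so mysql_num_rows() has nothing to count; checking mysql_affected_rows() is the usual fix. A sketch (note in passing that the VALUES list above doesn't line up with the column list, and that interpolating raw $_GET values invites SQL injection - mysql_real_escape_string() at minimum):

        <?php
        $check = "INSERT INTO family (loginName, email, password, passwordhint, location)
                  VALUES ('$u', '$em', '$pw', '$pwh', '$zc')";
        $ok = mysql_query($check, $link) or die(mysql_error());

        // INSERT returns TRUE/FALSE; count the rows it changed instead.
        if ($ok && mysql_affected_rows($link) == 1) {
            echo 'Yes';
        } else {
            echo 'No';
        }
        mysql_close($link);
        ?>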

    Read the article

  • Why won't my code segfault on Windows 7?

    - by Trevor
    This is an unusual question to ask, but here goes: in my code, I accidentally dereference NULL somewhere. But instead of the application crashing with a segfault, it seems to stop execution of the current function and just return control back to the UI. This makes debugging difficult because I would normally like to be alerted to the crash so I can attach a debugger. What could be causing this? Specifically, my code is an ODBC driver (i.e. a DLL). My test application is ODBC Test (odbct32w.exe), which allows me to explicitly call the ODBC API functions in my DLL. When I call one of the functions which has a known segfault, instead of crashing the application, ODBC Test simply returns control to the UI without printing the result of the function call. I can then call any function in my driver again. I do know that technically the application calls the ODBC driver manager, which loads and calls the functions in my driver. But that is beside the point, as my segfault (or whatever is happening) causes the driver manager function to not return either (as evidenced by the application not printing a result). One of my co-workers with a similar machine experiences this same problem while another does not, but we have not been able to determine any specific differences.
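
    One frequently cited culprit for exactly this symptom on 64-bit Windows 7 is the kernel silently swallowing exceptions that escape user-mode callbacks (described in Microsoft KB976038). Whether that applies here is an assumption, but with the hotfix installed a process can opt out of the filtering; a C sketch that resolves the API dynamically, since it only exists on patched systems:

        #include <windows.h>

        /* KB976038: exceptions thrown inside kernel-dispatched user-mode
         * callbacks can be silently discarded on 64-bit systems. Clearing the
         * callback filter lets them crash loudly so a debugger can attach. */
        #ifndef PROCESS_CALLBACK_FILTER_ENABLED
        #define PROCESS_CALLBACK_FILTER_ENABLED 0x1
        #endif

        typedef BOOL (WINAPI *GetPolicyFn)(LPDWORD);
        typedef BOOL (WINAPI *SetPolicyFn)(DWORD);

        static void DisableCallbackExceptionFilter(void)
        {
            HMODULE k32 = GetModuleHandleW(L"kernel32.dll");
            GetPolicyFn getPolicy =
                (GetPolicyFn)GetProcAddress(k32, "GetProcessUserModeExceptionPolicy");
            SetPolicyFn setPolicy =
                (SetPolicyFn)GetProcAddress(k32, "SetProcessUserModeExceptionPolicy");
            DWORD flags;
            if (getPolicy && setPolicy && getPolicy(&flags)) {
                setPolicy(flags & ~PROCESS_CALLBACK_FILTER_ENABLED);
            }
            /* If the functions are absent, the hotfix isn't installed and the
             * policy can't be changed programmatically. */
        }

    The policy is process-wide, so it would need to run in the process hosting the driver (the test application).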

    Read the article

  • Getting error: CS1061

    - by coure06
    Refer to http://stackoverflow.com/questions/369794/good-and-full-implementation-of-rss-feeds-in-asp-net-mvc and check the answer by Trevor de Koekkoek. I am getting this error: CS1061: 'object' does not contain a definition for 'Items' and no extension method 'Items' accepting a first argument of type 'object' could be found (are you missing a using directive or an assembly reference?)
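
    CS1061 against 'object' usually means the view is using the untyped default Model, so member access can't be resolved at compile time. Two common fixes, sketched here with RssFeed standing in as a hypothetical model type - adjust to whatever the linked answer's view actually receives:

        // Option 1: strongly type the view page in its directive:
        //   <%@ Page Inherits="System.Web.Mvc.ViewPage<RssFeed>" %>

        // Option 2: cast Model before touching its members:
        var feed = (RssFeed)ViewData.Model;   // hypothetical model type
        foreach (var item in feed.Items)      // Items now resolves at compile time
        {
            // render the item...
        }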

    Read the article

  • Microsoft Codename Houston

    - by kaleidoscope
    In one of the final talks about SQL Azure on Day 3 of PDC09, David Robinson, Senior PM on the Azure team, announced a project codenamed ‘Houston’, which is basically a Silverlight equivalent of SQL Server Management Studio. The concept comes from SQL Azure living in the cloud: if the only way to interact with it is by installing SSMS locally, then it does not feel like a consistent story. From the limited preview, it only contains the basics, but it clearly lets you create tables, stored procedures and views, edit them, and even add data to tables in a grid view reminiscent of Microsoft Access. The UI was based around the standard ribbon bar, with an object window on the left and a working pane on the right. As of now this tool is still pre-alpha; it seems like a basic tool that will facilitate rapid database development in the cloud. When asked about general availability, no dates were given, but calendar 2010 was indicated as the target. More information can be found at: http://sqlfascination.com/2009/11/20/pdc-09-day-3-sql-azure-and-codename-houston-announcement/ Tinu, O

    Read the article

  • Five Key Strategies in Master Data Management

    - by david.butler(at)oracle.com
    Here is a very interesting Profit Magazine article on MDM: A recent customer survey reveals the deleterious effects of data fragmentation. by Trevor Naidoo, December 2010   Across industries and geographies, IT organizations have grown in complexity, whether due to mergers and acquisitions, or decentralized systems supporting functional or departmental requirements. With systems architected over time to support unique, one-off process needs, they are becoming costly to maintain, and the Internet has only further added to the complexity. Data fragmentation has become a key inhibitor in delivering flexible, user-friendly systems. The Oracle Insight team conducted a survey assessing customers' master data management (MDM) capabilities over the past two years to get a sense of where they are in terms of their capabilities. The responses, by 27 respondents from six different industries, reveal five key areas in which customers need to improve their data management in order to get better financial results. 1. Less than 15 percent of organizations surveyed understand the sources and quality of their master data, and have a roadmap to address missing data domains. Examples of the types of master data domains referred to are customer, supplier, product, financial and site. Many organizations have multiple sources of master data with varying degrees of data quality in each source -- customer data stored in the customer relationship management system is inconsistent with customer data stored in the order management system. Imagine not knowing how many places you stored your customer information, and whether a customer's address was the most up to date in each source. In fact, more than 55 percent of the respondents in the survey manage their data quality on an ad-hoc basis. It is important for organizations to document their inventory of data sources and then profile these data sources to ensure that there is a consistent definition of key data entities throughout the organization. Some questions to ask are: How do we define a customer? What is a product? How do we define a site? The goal is to strive for one common repository for master data that acts as a cross reference for all other sources and ensures consistent, high-quality master data throughout the organization. 2. Only 18 percent of respondents have an enterprise data management strategy to ensure that data is treated as an asset to the organization. Most respondents handle data at the department or functional level and do not have an enterprise view of their master data. The sales department may track all their interactions with customers as they move through the sales cycle, the service department is tracking their interactions with the same customers independently, and the finance department also has a different perspective on the same customer. The salesperson may not be aware that the customer she is trying to sell to is experiencing issues with existing products purchased, or that the customer is behind on previous invoices. The lack of a data strategy makes it difficult for business users to turn data into information via reports. Without the key building blocks in place, it is difficult to create key linkages between customer, product, site, supplier and financial data. These linkages make it possible to understand patterns. A well-defined data management strategy is aligned to the business strategy and helps create the governance needed to ensure that data stewardship is in place and data integrity is intact. 3. 
Almost 60 percent of respondents have no strategy to integrate data across operational applications. Many respondents have several disparate sources of data with no strategy to keep them in sync with each other. Even though there is no clear strategy to integrate the data (see #2 above), the data needs to be synced and cross-referenced to keep the business processes running. About 55 percent of respondents said they perform this integration on an ad hoc basis, and in many cases, it is done manually with the help of Microsoft Excel spreadsheets. For example, a salesperson needs a report on global sales for a specific product, but the product has different product numbers in different countries. Typically, an analyst will pull all the data into Excel, manually create a cross reference for that product, and then aggregate the sales. The exact same procedure has to be followed if the same report is needed the following month. A well-defined consolidation strategy will ensure that a central cross-reference is maintained with updates in any one application being propagated to all the other systems, so that data is synchronized and up to date. This can be done in real time or in batch mode using integration technology. 4. Approximately 50 percent of respondents spend manual efforts cleansing and normalizing data. Information stored in various systems usually follows different standards and formats, making it difficult to match the data. A customer's address can be stored in different ways using a variety of abbreviations -- for example, "av" or "ave" for avenue. Similarly, a product's attributes can be stored in a number of different ways; for example, a size attribute can be stored in inches and can also be entered using the inch mark ("). These types of variations make it difficult to match up data from different sources. Today, most customers rely on manual, heroic efforts to match, cleanse, and de-duplicate data -- clearly not a scalable, sustainable model. To solve this challenge, organizations need the ability to standardize data for customers, products, sites, suppliers and financial accounts; however, less than 10 percent of respondents have technology in place to automatically resolve duplicates. It is no wonder, therefore, that we get communications about products we don't own, at addresses we don't reside, and using channels (like direct mail) we don't like. An all-too-common example of a potential challenge follows: customers end up receiving duplicate communications, which not only impacts customer satisfaction, but also incurs additional mailing costs. Cleansing, normalizing, and standardizing data will help address most of these issues. 5. Only 10 percent of respondents have the ability to share data that was mastered in a master data hub. Close to 60 percent of respondents have efforts in place that profile, standardize and cleanse data manually, and the output of these efforts is stored in spreadsheets in various parts of the organization. This valuable information is not easily shared with the rest of the organization and, more importantly, this enriched information cannot be sent back to the source systems so that the data is fixed at the source. A key benefit of a master data management strategy is not only to clean the data, but to also share the data back to the source systems as well as other systems that need the information. Aside from the source systems, another key beneficiary of this data is the business intelligence system.
Having clean master data as input to business intelligence systems provides more accurate and enhanced reporting.

Characteristics of Stellar MDM

When deciding on the right master data management technology, organizations should look for solutions that have four main characteristics:

- enterprise-grade MDM performance
- complete technology that can be rapidly deployed and addresses multiple business issues
- end-to-end MDM process management with data quality monitoring and assurance
- pre-built, business-relevant MDM applications with data stores and workflows

These master data management capabilities will aid in moving closer to a best-practice maturity level, delivering tremendous efficiencies and savings as well as revenue growth opportunities as a result of better understanding your customers.

Trevor Naidoo is a senior director in Industry Strategy and Insight at Oracle.

    Read the article

  • Slides and Pictures from PowerShell Saturday Columbus 2012

    - by Brian Jackett
    On March 10th, 2012 the first ever PowerShell Saturday conference took place in Columbus, OH and I couldn’t be happier with the outcome. We had 100 attendees from 10 different states (the biggest surprise to me) come to see 6 speakers present on a variety of PowerShell topics: introduction, WMI, SharePoint, Active Directory, Exchange, 3rd party products and more. A big thank you also goes out to a number of people. Planning committee: Wes Stahler, lead organizer of PowerShell Saturday Columbus and president of the Central Ohio PowerShell User Group; Ed “Microsoft Scripting Guy” Wilson; Teresa “The Scripting Wife” Wilson; Ashley McGlone; and Brian T. Jackett (myself). Speakers: Ed Wilson, Ashley McGlone, James Brundage, Trevor Sullivan, and Daniel Cruz. Volunteer: Lisa Gardner, fellow Microsoft PFE, volunteered her time on a Saturday to assist with smooth operation of the day. Facility coordination: Debbie Carrier, facilities coordinator for the Columbus Microsoft Office, helped us out greatly with the venue. Slides and Script Samples: I presented my session on “PowerShell for the SharePoint 2010 Developer”. Below you can download the slides and script samples. Photos: I wasn’t able to take too many pictures (only 3) as I was busy doing my presentation, answering questions, and taking care of random items throughout the day. Pictures on Facebook: click here. Pictures on SkyDrive (higher res): PowerShell Saturday Columbus Mar '12. Conclusion: I’m very happy that this first ever PowerShell Saturday was a success. My fellow PFE and speaker Ashley McGlone also has a short write-up on his blog about the event (click here). I have heard rumors that there are other cities starting to plan their own local events. When I hear more details I’ll spread the word here and on Twitter. -Frog Out

    Read the article

  • fast opening and closing connection with a specific port

    - by michale
    We have a main application named "Trevor" installed on a 2008 R2 machine named "TEAMER12", which is now slow. One more application named "TVS" is also running on it, and we found there were many connections per second occurring to port 5009. The netstat tool shows fast connection open/close activity on port 5009. First the port is in listening mode, as shown below:

        TCP    0.0.0.0:5009           TEAMER12:0             LISTENING

    Then it establishes connections like:

        TCP    127.0.0.1:5009         TEAMER12:49519         ESTABLISHED
        TCP    127.0.0.1:5009         TEAMER12:60903         ESTABLISHED

    After that they move to TIME_WAIT, and I could see several entries like the one shown below:

        TCP    127.0.0.1:49156        TEAMER12:5009          TIME_WAIT

    After that it will establish connections like:

        TCP    127.0.0.1:60903        TEAMER12:5009          ESTABLISHED
        TCP    127.0.0.1:64181        TEAMER12:microsoft-ds  ESTABLISHED

    Again there will be several TIME_WAIT entries like:

        TCP    127.0.0.1:49156        TEAMER12:5009          TIME_WAIT

    Finally it will establish like this:

        TCP    172.26.127.40:139      TEAMER12:0             LISTENING
        TCP    172.26.127.42:139      TEAMER12:0             LISTENING
        TCP    172.26.127.42:5009     TEAMER12:64445         ESTABLISHED
        TCP    172.26.127.42:64445    TEAMER12:5009          ESTABLISHED

    Can anybody tell me why so many connections per second are occurring to port 5009, and why the application is slow?

    Read the article

  • opening and closing connection with port happening fastly

    - by michale
    We have a main application named "Trevor" installed on a 2008 R2 machine named "TEAMER12", which is now slow. One more application named "TVS" is also running on it, and we found there were many connections per second occurring to port 5009. The netstat tool shows fast connection open/close activity on port 5009. First the port is in listening mode, as shown below:

        TCP    0.0.0.0:5009           TEAMER12:0             LISTENING

    Then it establishes connections like:

        TCP    127.0.0.1:5009         TEAMER12:49519         ESTABLISHED
        TCP    127.0.0.1:5009         TEAMER12:60903         ESTABLISHED

    After that they move to TIME_WAIT, and I could see several entries like the one shown below:

        TCP    127.0.0.1:49156        TEAMER12:5009          TIME_WAIT

    After that it will establish connections like:

        TCP    127.0.0.1:60903        TEAMER12:5009          ESTABLISHED
        TCP    127.0.0.1:64181        TEAMER12:microsoft-ds  ESTABLISHED

    Again there will be several TIME_WAIT entries like:

        TCP    127.0.0.1:49156        TEAMER12:5009          TIME_WAIT

    Finally it will establish like this:

        TCP    172.26.127.40:139      TEAMER12:0             LISTENING
        TCP    172.26.127.42:139      TEAMER12:0             LISTENING
        TCP    172.26.127.42:5009     TEAMER12:64445         ESTABLISHED
        TCP    172.26.127.42:64445    TEAMER12:5009          ESTABLISHED

    Can anybody tell me why so many connections per second are occurring to port 5009, and why the application is slow?

    Read the article

  • Caption Competition 2: The Captioning

    - by Simple-Talk Editorial Team
    Caption competition time again! What’s going on here then?   Some suggestions to get your comedy juices flowing: “So long chaps, hope you can continue to cope without a written disaster plan!” – said the only DBA “These shoes cost a lot of money, I’m not muddying them in the SAN Admin waters!” “Down Devs, down. Stay away from my database.” It had taken a lot of time and work, but finally Trevor’s out of office setup had the sense of occasion he needed. “Could you just add one small feature?” shouted upper management, hurtling by. Add your suggestion in the comments for a chance to win $50 in Amazon vouchers. Anything computer-related will help, but feel free to suggest anything. The competition runs until 5 p.m. (BST) on Friday the 16th of May.

    Read the article

  • SyncToBlog #10 Lots of Azure and Cloud Links including MIX10 videos

    - by Eric Nelson
    Just getting a few interesting cloud links “down on paper”. I last did one of these on Azure in Feb 2010.

    Cloud Links:

    - Article on Debugging in the Cloud
    - http://code.msdn.microsoft.com/azurescale - a sample app that demonstrates monitoring and automatically scaling an Azure application in response to dropping performance etc. Basically a console app that checks perf stats and then uses the Service Management API to spin up new instances when needed.
    - Azure In Action book is imminent :)
    - Running Memcached in Windows Azure from the MS UK team
    - Using Microsoft Codename Dallas as a data source for Drupal, also from the MS UK team
    - I often mention them – but this post is the biz! Metodi on fault and upgrade domains
    - Detailed blog post on comparing Azure AppFabric Service Bus REST support to the free Faye Ruby+JavaScript gem that implements the JSON publish/subscribe protocol Bayeux.
    - AppFabric LABS allow you to test out and play with experimental AppFabric technologies.
    - Details of the upcoming VM support in Windows Azure
    - Nice series of posts from J D Meier in the Patterns and Practices team: How To Use ASP.NET Forms Auth with Azure Tables; How To Use ASP.NET Forms Auth with Roles in Azure Tables; How To Use ASP.NET Forms Auth with SQL Server on Windows Azure

    And sessions from MIX10, held March 15th to 17th:

    - Lap around the Windows Azure Platform – Steve Marx
    - Building and Deploying Windows Azure Based Applications with Microsoft Visual Studio 2010 – Jim Nakashima
    - Building PHP Applications using the Windows Azure Platform – Craig Kitterman, Sumit Chawla
    - Using Ruby on Rails to Build Windows Azure Applications – Sriram Krishnan
    - Microsoft Project Code Name “Dallas": Data for your apps – Moe Khosravy
    - Using Storage in the Windows Azure Platform – Chris Auld
    - Building Web Applications with Windows Azure Storage – Brad Calder
    - Building Web Application with Microsoft SQL Azure – David Robinson
    - Connecting Your Applications in the Cloud with Windows Azure AppFabric – Clemens Vasters
    - Microsoft Silverlight and Windows Azure: A Match Made for the Web – Matt Kerner

    Something for everyone :)

    Read the article

  • Where is the SQL Azure Development Environment

    - by BuckWoody
    Recently I posted an entry explaining that you can develop in Windows Azure without having to connect to the main service on the Internet, using the Software Development Kit (SDK) which installs two emulators - one for compute and the other for storage. That brought up the question of the same kind of thing for SQL Azure. The short answer is that there isn’t one. While we’ll make the development experience for all versions of SQL Server, including SQL Azure, easier to write against, you can simply treat it as another edition of SQL Server. For instance, many of us use the SQL Server Developer Edition - which in versions up to 2008 is actually the Enterprise Edition - to develop our code. We might write that code against all kinds of environments, from SQL Express through Enterprise Edition. We know which features work on a certain edition, what T-SQL it supports and so on, and develop accordingly. We then test on the actual platform to ensure the code runs as expected. You can simply fold SQL Azure into that same development process. When you’re ready to deploy, if you’re using SQL Server Management Studio 2008 R2 or higher, you can script out the database when you’re done as a SQL Azure script (with change notifications where needed) by selecting the right “Engine Type” on the scripting panel: (Thanks to David Robinson for pointing this out and my co-worker Rick Shahid for the screen-shot - saved me firing up a VM this morning!) Will all this change? Will SSMS, “Data Dude” and other tools change to include SQL Azure? Well, I don’t have a specific roadmap for those tools, but we’re making big investments on Windows Azure and SQL Azure, so I can say that as time goes on, it will get easier. For now, make sure you know what features are and are not included in SQL Azure, and what T-SQL is supported. Here are a couple of references to help: General Guidelines and Limitations: http://msdn.microsoft.com/en-us/library/ee336245.aspx Transact-SQL Supported by SQL Azure: http://msdn.microsoft.com/en-us/library/ee336250.aspx SQL Azure Learning Plan: http://blogs.msdn.com/b/buckwoody/archive/2010/12/13/windows-azure-learning-plan-sql-azure.aspx

    Read the article

  • Follow the How-To Geek Writers on Twitter

    - by The Geek
    Ever wonder what the How-To Geek writers are up to? If you’re a Twitter user, you can connect with us directly. We’ve also setup a new @howtogeeknews account if you just want to keep up with the latest articles. So if you want just the latest articles… click the image below and then click the Follow button. Otherwise, if you’d like to connect with the rest of us that actually use Twitter, you can follow each of us separately through the links below. Note: Let’s try to stick to discussion, and leave the tech support questions for our forum. the How-To Geek (that’s me!) - @howtogeek Matthew Guay – @maguay Trevor Bekolay – @TrevorBekolay Asian Angel – @asian_angel Andrew Gehman – @andrewgehman Some of the HTG writers are not currently using Twitter… but I’m gonna list their accounts just in case you wanted to follow them. Mark Virtue – @markvirtue Mysticgeek – @mysticgeek (He’s far too productive to waste time on Twitter!) Enjoy the conversation!

    Read the article

  • ArchBeat Link-o-Rama for 2012-09-07

    - by Bob Rhubart
    Oracle Technology Network Architect Day - Boston, MA - 9/12/2012
    Sure, you could ask a voodoo priestess for help in improving your solution architecture skills. But there's the whole snake thing, and the zombie thing, and other complications. So why not keep it simple and register for Oracle Technology Network Architect Day in Boston, MA. There's no magic, just a full day of technical sessions covering Cloud, SOA, Engineered Systems, and more. Registration is free. Wednesday September 12, 2012, 8:00 a.m. – 5:00 p.m., Boston Marriott Burlington, One Burlington Mall Road, Burlington, MA 01803

    Attend OTN Architect Day in Los Angeles – by Architects, for Architects – October 25
    The OTN Architect Day roadshow stops in Boston next week, then it's on to Los Angeles for another all architecture, all day event on Thursday October 25, 2012 at the Sofitel Los Angeles, 555 Beverly Boulevard, Los Angeles, CA 90048. Like all Architect Day events, this one is absolutely free, so register now.

    Webcast: Oracle WebCenter in Action: Hitachi Data Systems
    Catch this live webcast on Thursday, September 13, 2012 (10am PT / 1pm ET) to learn from speakers from Hitachi Data Systems, LingoTek, and Oracle about how Hitachi used Oracle WebCenter to improve the web experience for its international customers.

    Article Index: Architect Community Column in Oracle Magazine
    Did you know that Oracle Magazine features a regular column devoted specifically to the architect community? Every column includes insight and expertise from architects who regularly deal with the issues architects face. Click here to see a complete list of articles.

    ADF EMG Sunday at OOW 2012 (30. Sep 2012) - A day full of content | Frank Nimphius
    Frank Nimphius shares details on Chris Muir's ADF EMG series of sessions during User Group Sunday at OOW, Sept 30, in Moscone West room 305.

    The Role of Oracle VM Server for SPARC in a Virtualization Strategy
    New OTN article from Matthias Pfützner.

    Countdown to Oracle OpenWorld 2012 | Oracle WebCenter Blog
    A helpful list of OOW sessions focused on Oracle WebCenter.

    Oracle Exalogic X2-2 walkthrough | Jan van Zoggel
    "For those of us not lucky enough to have one at home," Jan van Zoggel recommends this "very cool" video featuring "a detailed walkthrough explaining each component of a Oracle Exalogic X2-2 machine," presented by Oracle Exalogic VP Development Brad Cameron.

    September OTN Member Offers | OTN Blog
    Save big on books from top tech publishers with these discounts for OTN members.

    Thought for the Day
    "Only Robinson Crusoe had everything done by Friday." — Unknown (Source: Quote Garden)

    Read the article

  • Wireframing: A Day In the Life of UX Workshop at Oracle

    - by ultan o'broin
    The Oracle Applications User Experience team's Day in the Life (DITL) of User Experience (UX) event was run in Oracle's Redwood Shores HQ for Oracle Usability Advisory Board (OUAB) members. I was charged with putting together a wireframing session, together with Director of Financial Applications User Experience, Scott Robinson (@scottrobinson). Example of stunning new wireframing visuals we used on the DITL events. We put on a lively show, explaining the basics of wireframing, the concepts, what it is and isn't, considerations on wireframing tool choice, and then imparting some tips and best practices. But the real energy came when the OUAB customers and partners in the room were challenged to do some wireframing of their own. Wireframing is about bringing your business and product use cases to life in real UX visual terms, by creating a low-fidelity drawing to iterate and agree on in advance of prototyping and coding what is to be finally built and rolled out for users. All the best people wireframe. Leonardo da Vinci used "cartoons" on some great works, tracing outlines first and using red ochre or charcoal dropped through holes in the tracing parchment onto the canvas to outline the subject. (Image distributed under Wikimedia commons license) Wireframing an application's user experience design enables you to:

    - Obtain stakeholder buy-in.
    - Enable faster iteration of different designs.
    - Determine the task flow navigation paths (in Oracle Fusion Applications navigation is linked with user roles).
    - Develop a content strategy (readability, search engine optimization (SEO) of content, and so on).
    - Lay out the pages, widgets, groups of features, and so on.
    - Apply usability heuristics early (no replacement for usability testing, but a great way to do some heavy lifting up front).
    - Decide upstream which functional user experience design patterns to apply (out-of-the-box solutions that expedite productivity).
    - Assess which Oracle Application Development Framework (ADF) or equivalent technology components can be used (again, developer productivity is enhanced downstream).

    We ran a lively hands-on exercise where teams wireframed a choice of application scenarios using the time-honored tools of pen and paper. Scott worked the floor like a pro, pointing out great use of features, best practices, innovations, and making sure that the whole concept of wireframing, the gestalt, transferred. "We need more buttons!" The cry of the energized. Not quite. The winning wireframe session (online shopping scenario) from the Applications UX DITL event shown. Great fun, great energy, and great teamwork were evident in the room. Naturally, there were prizes for the best wireframe. Well, actually, prizes were handed out to the other attendees too! An exciting, slightly different aspect to delivery of this session made the wireframing event one of the highlights of the day. And definitely, something we will repeat again when we get the chance. Thanks to everyone who attended, contributed, and helped organize.

    Read the article

  • Netgear FVS336G: appropriate solution for today's small businesses?

    - by bwerks
    Hey all, I've been looking into routers to facilitate a VPN solution for a small business. While the Netgear FVS336G looks good on paper, it appears to have some fairly crippling setbacks that drag down what appears to be some great hardware. First off, the unit has been around for a couple of years now, perhaps before 64-bit operating systems were as common as they are now, and complaints are everywhere claiming that SSL or IPsec (or both) VPN connections will not work with 64-bit operating systems. However, most of these claims mention only Vista, which makes me think that these problems could potentially have been solved since then. Unfortunately though, Netgear's support forums seem to be incredibly private, policed by some troll named jmizuguchi who just closes down public posts in order to marshal them into the private ones. Danger, Will Robinson. Apparently their firmware upgrade process is a nightmare too, but that's beside the point. My question is this: has anyone configured a Netgear FVS336G to operate in a Server 2008 (or R2)/Windows 7 64-bit network? If so, is it possible to use the Microsoft VPN client, or are third-party clients still required? If this thing has just failed the test of time, is there a feature-comparable unit that I've missed, at anywhere near the same price range? Thanks!

    Read the article

  • F1 Pit Pragmatics

    - by mikef
    "I hate computers. No, really, I hate them. I love the communications they facilitate, I love the conveniences they provide to my life. but I actually hate the computers themselves." - Scott Merrill, 'I hate computers: confessions of a Sysadmin' If Scott's goal was to polarize opinion and trigger raging arguments over the 'real reasons why computers suck', then he certainly succeeded. Impassioned vitriol sits side-by-side with rational debate. Yet Scott's fundamental point is absolutely on the money - Computers are a means to an end. The IT industry is finally starting to put weight behind the notion that good User Experience is an absolutely crucial goal, a cause championed by the likes of Microsoft's Bill Buxton, and which Apple's increasingly ubiquitous touch screen interface exemplifies. However, that doesn't change the fact that, occasionally, you just have to man up and deal with complex systems. In fact, sometimes you just need to sacrifice everything else in the name of performance. You'll find a perfect example of this Faustian bargain in Trevor Clarke's fascinating look into the (diabolical) IT infrastructure of modern F1 racing - high performance, high availability. high everything. To paraphrase, each car has up to 100 sensors, transmitting around 30Gb of data over the course of a race (70% in real-time). This data is then processed by no less than 3 servers (per car) so that the engineers in the pit have access to telemetry, strategy information, timing feeds, a connection back to the operations room in the team's home base - the list goes on. All of this while the servers are exposed "to carbon dust, oil, vibration, rain, heat, [and] variable power". Now, this is admittedly an extreme context where there's no real choice but to use complex systems where ease-of-use is, at best, a secondary concern. The flip-side is seen in small-scale personal computing such as that seen in Apple's iDevices, which are incredibly intuitive but limited in their scope. In terms of what kinds of systems they prefer to use, I suspect that most SysAdmins find themselves somewhere along this axis of Power vs. Usability, and which end of this axis you resonate with also hints at where you think the IT industry should focus its energy. Do you see yourself in the F1 pit, making split-second decisions, wrestling with information flows and reticent hardware to bend them to your will? If so, I imagine you feel that computers are subtle tools which need to be tuned and honed, using the advanced knowledge possessed only by responsible SysAdmins (If you have an iPhone, I suspect it's jail-broken). If the machines throw enigmatic errors, it's the price of flexibility and raw power. Alternatively, would you prefer to have your role more accessible, with users empowered by knowledge, spreading the load of managing IT environments? In that case, then you want hardware and software to have User Experience as their primary focus, and are of the "means to an end" school of thought (you're probably also fed up with users not listening to you when you try and help). At its heart, the dichotomy is between raw power (which might be difficult to use) and ease-of-use (which might have some limitations, but you can be up and running immediately). Of course, the ultimate goal is a fusion of flexibility, power and usability all in one system. It's achievable in specific software environments, and Red Gate considers it a target worth aiming for, but in other cases it's a goal right up there with cold fusion. 
I think it'll be a long time before we see it become ubiquitous. In the meantime, are you Power-Hungry or a Champion of Usability? Cheers, Michael Francis Simple Talk SysAdmin Editor

    Read the article

< Previous Page | 7 8 9 10 11 12  | Next Page >