Search Results

Search found 9929 results on 398 pages for 'azure tables'.

  • Once VPN connection is done, how do I proceed reaching the other side address space?

    - by sports
    I'm using Windows Azure and I created a site-to-site VPN, configured like this:

    My virtual network:
      My address space: 10.2.0.0/16 (65531)
      Subnet1: 10.2.1.0/24 (251)
      Subnet2: 10.2.2.0/24 (251)
      Gateway: 10.2.3.0/29 (3)
      My public gateway IP: 137.135.x.z (I won't show x and z for security reasons)
    This public gateway uses, as you can see, 5 IPs on subnet1, 5 IPs on subnet2, and 5 IPs on the gateway.

    "Their" virtual network (in Azure this would be a "Local network"):
      Their address space: 172.60.100.67/32, 172.60.100.68/32, 172.60.100.69/32
      Their device public IP: x.x.x.x (omitted for security reasons)
    Notice their address space is just 3 IPs.

    So: the VPN is "in green" (Azure is literally showing these two as connected), and now my question is: how do I reach their address space? I've tried creating a virtual machine (Windows Server 2008, but it could be an Ubuntu) on my virtual network, and it is automatically placed on subnet1 or subnet2, so it gets the IP 10.2.1.0 (valid example); I can't choose to place the virtual machine in the gateway address space. How do I reach any of the IPs 172.60.100.67, 172.60.100.68 or 172.60.100.69? In other words, how can I telnet or ping any of these IPs, or see them on my network? Please provide answers for Windows Server 2008 or for Ubuntu; I'm open to creating any virtual machine.

  • Big AdventureWorks2012

    - by jamiet
    Last week I launched AdventureWorks on Azure, an initiative to make SQL Azure accessible to anyone, in my blog post AdventureWorks2012 now available for all on SQL Azure. Since then I think it's fair to say that the reaction has been lukewarm, with 31 insertions into the [dbo].[SqlFamily] table and only 8 donations via PayPal to support it; on the other hand, those 8 donors have been incredibly generous and we nearly have enough in the bank to cover a full year's worth of availability. It was always my intention to try and make this offering more appealing, and to that end I have used an adapted version of Adam Machanic's make_big_adventure.sql script to massively increase the amount of data in the database and give the community more scope to really push SQL Azure and see what it is capable of. There are now two new tables in the database:
      [dbo].[bigProduct] with 25,200 rows
      [dbo].[bigTransactionHistory] with 7,827,579 rows
    The credentials to login and use AdventureWorks on Azure are as they were before:
      Server: mhknbn2kdz.database.windows.net
      Database: AdventureWorks2012
      User: sqlfamily
      Password: sqlf@m1ly
    Remember, if you want to support AdventureWorks on Azure, simply click here to launch a pre-populated PayPal Send Money form - all you have to do is login, fill in an amount, and click Send. We need more donations to keep this up and running, so if you think this is useful and worth supporting, please donate.
    I mentioned that I had to adapt Adam's script; the main reasons were:
      Cross-database queries are not yet supported in SQL Azure, so I had to create a local copy of [dbo].[spt_values] rather than reference the one in [master].
      SELECT…INTO is not supported in SQL Azure.
      The 1GB limit of the SQL Azure Web edition meant that there would not be enough space to store all the data generated by Adam's script, so I had to decrease the total number of rows.
    The amended script is available on my SkyDrive at https://skydrive.live.com/redir.aspx?cid=550f681dad532637&resid=550F681DAD532637!16756&parid=550F681DAD532637!16755
    @Jamiet
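    For anyone adapting a similar script, the SELECT…INTO restriction usually just means declaring the target table explicitly and switching to INSERT…SELECT. A minimal sketch of the rewrite, using a hypothetical table and columns rather than anything from the actual script:

        -- SQL Azure rejects this form:
        --   SELECT ProductID, Name INTO #p FROM [dbo].[bigProduct];
        -- Equivalent that works: create the table first, then populate it.
        CREATE TABLE #p
        (
            ProductID int          NOT NULL,
            Name      nvarchar(50) NOT NULL
        );

        INSERT INTO #p (ProductID, Name)
        SELECT ProductID, Name
        FROM [dbo].[bigProduct];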

  • Squid on an Azure VM

    - by LantisGaius
    I can't get it to work. Here's exactly what I did:
      1. Create a new Azure VM, Windows Server 2012.
      2. RDP to the new VM.
      3. Download and extract Squid for Windows (2.7.STABLE8).
      4. Rename the conf files (squid, mime & cachemgr).
      5. Add the following lines at the end of squid.conf:
           auth_param basic program c:/squid/libexec/ncsa_auth.exe c:/squid/etc/passwd.txt
           auth_param basic children 5
           auth_param basic realm Welcome to http://abcde.fg Squid Proxy!
           auth_param basic credentialsttl 12 hours
           auth_param basic casesensitive off
           acl ncsa_users proxy_auth REQUIRED
           http_access allow ncsa_users
      6. Use http://www.htaccesstools.com/htpasswd-generator-windows/ to create passwd.txt.
      7. Test passwd.txt via c:/squid/libexec/ncsa_auth.exe c:/squid/etc/passwd.txt (success).
      8. squid -z
      9. squid -i
      10. net start squid (no errors so far).
      11. Go to https://manage.windowsazure.com, Virtual Machines - myVM - Endpoints.
      12. Add an endpoint: Name: Squid, Protocol: TCP, Public Port: 80, Private Port: 3128.
    That's it. Unfortunately, it doesn't work. I think I screwed something up at the endpoint? I'm not sure.. help?
    EDIT: I'm testing it via Firefox - Options - Advanced - Network, and the exact error is "The Proxy Server is refusing connections." I'm using my DNS name "abcdef.cloudapp.net" as the proxy server, with port 80 (since that's my public endpoint).

  • Azure load-balancing strategy

    - by growse
    I'm currently building out a small web deployment using VM instances on MS Azure. The main problem I'm facing at the moment is trying to figure out how to get the load balancing to detect if a particular VM has failed and not route traffic to that VM. As far as I can tell, there are only two load-balancing options:
      1. Have multiple VMs (web01, web02, web03, etc.) within the same 'cloud service' behind a single VIP, and configure the endpoints to be load balanced.
      2. Create multiple 'cloud services', put a single web VM in each, and create a traffic manager service across all these services.
    It appears that (1) is extremely simplistic and doesn't attempt to do any host failure detection. (2) appears to be much more varied, but requires me to put all my webservers in their own individual cloud services. Traffic Manager appears to be much more directed at a geographic failover scenario, where you have multiple cloud services across different regions. This approach also has the disadvantage that my web servers won't be able to communicate with my databases on internal IP addresses, unlike scenario (1). What's the best approach here?

  • Azure's Ubuntu 12.04 fails to install PHP5

    - by Alex Kennberg
    Similar to this article from Azure themselves: http://www.windowsazure.com/en-us/manage/linux/common-tasks/install-lamp-stack/ I am trying to install PHP5 on an Ubuntu 12.04 virtual machine. However, it fails installing the ssl-cert package:
      $ sudo apt-get install php5
      Reading package lists... Done
      Building dependency tree
      Reading state information... Done
      php5 is already the newest version.
      0 upgraded, 0 newly installed, 0 to remove and 49 not upgraded.
      1 not fully installed or removed.
      After this operation, 0 B of additional disk space will be used.
      Do you want to continue [Y/n]? y
      Setting up ssl-cert (1.0.28) ...
      Could not create certificate. Openssl output was:
      Generating a 2048 bit RSA private key
      ............................+++
      ...................................................................................................................+++
      writing new private key to '/etc/ssl/private/ssl-cert-snakeoil.key'
      -----
      problems making Certificate Request
      140320238503584:error:0D07A097:asn1 encoding routines:ASN1_mbstring_ncopy:string too long:a_mbstr.c:154:maxsize=64
      dpkg: error processing ssl-cert (--configure):
      subprocess installed post-installation script returned error exit status 1
      Errors were encountered while processing:
      ssl-cert
      E: Sub-process /usr/bin/dpkg returned an error code (1)
    Any tips appreciated.
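    (For what it's worth, the "string too long ... maxsize=64" line suggests the generated certificate's Common Name, which is taken from the machine's fully qualified hostname, exceeds the 64-character limit X.509 imposes on CN fields - easy to hit with the long hostnames Azure VMs can end up with. Shortening the hostname before re-running the install is one thing worth trying.)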

  • Q&A: Can you develop for the Windows Azure Platform using Windows XP?

    - by Eric Nelson
    This question has come up several times recently as we take several hundred UK developers through 6 Weeks of Windows Azure training (sorry – we are full).
    Short answer: in the main, yes.
    Longer answer: the question is sparked by the requirements as stated on the Windows Azure SDK download page, namely the supported operating systems: Windows 7; Windows Vista; Windows Vista 64-bit Editions Service Pack 1; Windows Vista Business; Windows Vista Business 64-bit edition; Windows Vista Enterprise; Windows Vista Enterprise 64-bit edition; Windows Vista Home Premium; Windows Vista Home Premium 64-bit edition; Windows Vista Service Pack 1; Windows Vista Service Pack 2; Windows Vista Ultimate; Windows Vista Ultimate 64-bit edition. Notice there is no mention of Windows XP. However, things are not quite that simple. The Windows Azure Platform consists of three released technologies:
      Windows Azure
      SQL Azure
      Windows Azure platform AppFabric
    The Windows Azure SDK is only for one of the three technologies, Windows Azure. What about SQL Azure and AppFabric? Well, it turns out that you can develop for both of these just fine with Windows XP:
      SQL Azure development is really just SQL Server development with a few gotchas – and for local development you can simply use SQL Server 2008 R2 Express (other versions will also work).
      AppFabric also has no local simulation environment, and its SDK will install fine on Windows XP (SDK download).
    Actually, it is also possible to do Windows Azure development on Windows XP if you are willing to always work directly against the real Azure cloud running in Microsoft datacentres. In practice, however, this would be painful and time-consuming, which is why the Windows Azure SDK installs a local simulation environment. Therefore, if you want to develop for Windows Azure, I would recommend you either upgrade from Windows XP to Windows 7, or use a virtual machine running Windows 7. If this is a temporary requirement, you could consider building a virtual machine using the Windows 7 Enterprise 90-day eval. Or you could download a pre-configured VHD – but I can't quite find the link for a Windows 7 VHD. Pointers welcomed. Thanks.

  • Cut in Excel doesn't work, and copying tables from one program to another returns text

    - by Kristina
    My Excel 2007 on a Windows 7 operating system seems to have a problem with the regular cut function. When I highlight cells I want to cut and press cut (either the keyboard shortcut Ctrl+X, the Home menu cut command, or the right-click menu), the cells flash for a split second and then just turn normal again. When I then paste them, they paste as if the copy function was used. If I right-click to use the "Insert cut cells" function, it is not one of the offered options at all. On my home computer I have the same combination, Excel 2007 on Windows 7, and it works just fine. Could the problem be due to the 64-bit Win7 version at my job versus the 32-bit version at home? Another problem: when I copy a table from Excel to Word, pasting results in unformatted text instead of the table as it was in Excel. Has anyone had such problems and can offer a solution? Thanks a lot.

  • Separate tables or single table with queries?

    - by Joe
    I'm making an employee information database, and I need to handle separated (former) employees. Should I:
      a. set up a query with a macro to send separated employees to a separate table, or
      b. just add a flag to the single table denoting separation?
    I understand that best practice is choice (b), and the one reason I can think of for this is that any structural changes I make to the table later would otherwise have to be made in both places. But it also seems like the flag forces me to filter on it in basically every useful query I'm going to write in the future.
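    If it helps frame choice (b): the usual way to avoid repeating the filter everywhere is to hide the flag behind a view (or, in Access terms, a saved query) and run everything else against that. A minimal T-SQL sketch, with a hypothetical schema:

        -- One table for everyone, with a flag for separated employees.
        CREATE TABLE dbo.Employee
        (
            EmployeeId  int           NOT NULL PRIMARY KEY,
            FullName    nvarchar(100) NOT NULL,
            IsSeparated bit           NOT NULL DEFAULT 0
        );
        GO

        -- The filter lives in exactly one place.
        CREATE VIEW dbo.ActiveEmployee AS
        SELECT EmployeeId, FullName
        FROM dbo.Employee
        WHERE IsSeparated = 0;
        GO

        -- Everyday queries never mention the flag.
        SELECT FullName FROM dbo.ActiveEmployee;

    Structural changes still happen in one table, and the "filter on the flag" cost shrinks to maintaining one view.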

  • New SQL Azure Development Accelerator Core promotional offer announced

    - by Eric Nelson
    This is (almost) a straight copy and paste, but it represents an important announcement worthy of a little more "exposure" :-)
    Starting August 1, 2010, we will release a new SQL Azure Development Accelerator Core promotional offer. This new offer will give you the flexibility to purchase commitment quantities of SQL Azure Business Edition databases independent of other Windows Azure platform services at a deeply discounted monthly price. The offer is valid only for a six-month term. You may purchase, in 10 GB increments, the amount of our Business Edition relational database that you require (each Business Edition database is capable of storing up to 50 GB). The offer price will be $74.95 per 10 GB per month. This promotional offer represents 25% off of our normal consumption rates. Monthly Business Edition relational database usage exceeding the purchased commitment amount, and usage of other Windows Azure platform services under this offer, will be charged at our normal consumption rates. Please click here for full details of our new SQL Azure Development Accelerator Core offer.
    Related links:
      Details of 5GB and 50GB databases have been released
      http://ukazure.ning.com – UK community site
      Getting started with the Windows Azure Platform
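    To put the numbers in perspective: a database kept at the full 50 GB Business Edition capacity would need five 10 GB increments, i.e. 5 x $74.95 = $374.75 per month under the offer, for the six-month term.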

  • New partnership allows auto-transposition of client/server application to Windows Azure

    - by Webgui
    The economics of IT is changing rapidly, and organizations are searching to widen and secure the availability of their systems while lowering costs, which is exactly what the cloud is meant to do. Running your systems on Microsoft's Windows Azure cloud, for example, improves and secures the availability, accessibility and scalability (both up and down) of your systems and supports the new IT economics. However, in order to take advantage of the cloud's promise of lower cost of ownership, applications must be built or adjusted to work on that platform, and in most cases this is not a simple task. Even existing web applications cannot always be transferred to Azure without some changes, and for client/server applications the task is far more challenging, even to the point where it seems impossible. The reason is the gap between desktop client/server technology and the cloud's. For that reason, most of the known methodologies for migrating existing client/server applications actually involve a rewrite of the desktop system for the cloud. A unique approach is introduced by Visual WebGui: it creates a virtualization layer atop the ASP.NET web server, moves the transformed or generated .NET code to that layer, and then, using a patent-pending protocol, renders the user interface within a plain browser. The end result is pure .NET code that serves as the base code for a pure rich web application, and now, due to a collaboration with Microsoft, Visual WebGui provides the shortest path from client/server to the Windows Azure cloud, handling close to 95% of the transformation to the cloud platform automatically.
    Application Migration to Azure without migraines. More information about the Instant CloudMove Azure solution here.

  • Windows Azure Event

    - by Blog Author
    Get cloud ready with Windows Azure.
    The cloud is everywhere, and here at Microsoft we're flying high with our cloud computing release, Windows Azure. As most of you saw at the Professional Developers Conference, the reaction to Windows Azure has been nothing short of "wow" – and based on your feedback, we've organized this special, all-day Windows Azure Firestarter event to help you take full advantage of the cloud. Maybe you've already watched a webcast, attended a recent MSDN Event on the topic, or done your own digging on Azure. Well, here's your chance to go even deeper. This one-of-a-kind event will focus on helping developers get 'cloud ready' with concrete details and hands-on tactics. We'll start by revealing Microsoft's strategic vision for the cloud, and then offer an end-to-end view of the Windows Azure platform from a developer's perspective. We'll also talk about migrating your data and existing applications (regardless of platform) onto the cloud. We'll finish up with an open panel and lots of time to ask questions. Following this event, please join us for an engaging conversation about any and all cloud computing topics. This FREE event is hosted by Northwest Cloud, the cloud-agnostic community group, and sponsored by Microsoft. http://www.nwcloud.org/redmond/2010-04-06

  • Feature Updates to the Windows Azure Portal

    - by Clint Edmonson
    Lots of activity over at the Windows Azure portal this weekend, including some exciting new features and major improvements to existing features. Here are the highlights:
      Support for Managing Co-administrators
        Set up account co-administrators to allow others to share service management duties for each Azure subscription.
      Import/Export Support for SQL Databases
        Export existing SQL Azure databases to blob storage using SQL Server 2012's BACPAC format.
        Create a new SQL Azure database from an existing BACPAC stored in blob storage.
      Storage Container Management and Access Control
        Create blob storage containers directly within the portal.
        Edit their public/private access settings.
        Drill into storage containers and see the blobs contained within them.
      Improved Cloud Service Status Notifications
        Detailed health status information about cloud services and roles as they transition between states.
      Virtual Machine Experience Enhancements
        Option to automatically delete corresponding VHD files from blob storage when deleting VM disks.
      Service Bus Management and Monitoring
        Ability to create and manage service bus Namespaces, Queues, Topics, Relays and Subscriptions.
        Rich monitoring of Topics, Queues, and Subscriptions with detailed and customizable dashboard metrics.
        Entity status (Topic, Queue, or Subscription) can be changed interactively via the dashboard.
        Direct links to the Access Control Services (ACS) namespaces when working with service bus access keys.
      Media Services Monitoring Support
        Monitor encoding jobs that are queued for processing, as well as active, failed and queued tasks for encoding jobs.
    The above features are all now live in production and available to use immediately. If you don't already have a Windows Azure account, you can sign up for a free trial and start using them today. Stay tuned to my twitter feed for Windows Azure announcements, updates, and links: @clinted
    Reference ID: P7VVJCM38V8R

  • MSBuild / PowerShell: Copy SQL Server 2012 database to SQL Azure via BACPAC (for Continuous Integration)

    - by giveme5minutes
    I'm creating a continuous integration MSBuild script which copies a database on an on-premise SQL Server 2012 instance to SQL Azure. Easy, right?

    Methods. After a fair bit of research I've come across the following methods:
      1. Use PowerShell to access the DAC library directly, then use the MSBuild PowerShell extension to wrap the script. This would require installing PowerShell 3 and working out how to make the MSBuild PowerShell extension work with it, as apparently MS moved the DAC API to a different namespace in the latest version of the library. PowerShell would give direct access to the API, but may require quite a bit of boilerplate.
      2. Use the sample DAC Framework Client Side Tools, which requires compiling them myself, as the downloads available from Codeplex only include the hosted version. It would also require fixing them to use DAC 3.0 classes, as they appear to currently use an earlier version of DAC. I could then call these tools from an <Exec Command="" /> in the MSBuild script. Less boilerplate, and if I hit any bumps in the road I can just make changes to the source.

    Processes. Using either method, the process could be:
      1. Export from on-premise SQL Server 2012 to a local BACPAC
      2. Upload the BACPAC to blob storage
      3. Import the BACPAC into SQL Azure via the hosted DAC
    Or:
      1. Export from on-premise SQL Server 2012 to a local BACPAC
      2. Import the BACPAC into SQL Azure via the client DAC

    Question. All of the above seems to be quite a lot of effort for something that seems to be a standard feature... so before I start reinventing the wheel and documenting the results for all to see: is there something really obvious that I've missed here? Is there a pre-written script that MS has released that I have not yet uncovered? There's a command in the GUI of SQL Server Management Studio 2012 that does EXACTLY what I'm trying to do (right-click on the local database, click "Tasks", click "Deploy Database to SQL Azure"). Surely if it's a few clicks in the GUI it must be a single command on the command line somewhere??

  • Windows Azure v1.7 Spring Release Today – New Management Dashboard

    - by ToStringTheory
    Today, Microsoft will be publicly releasing a new version of Azure for public consumption. The web conference, at http://www.meetwindowsazure.com, will be airing at 1 PM PST. They have already released an update to the Service Dashboard, which can be accessed by going to http://manage.windowsazure.com. I have some images of the new dashboard here that I have gathered and removed any PII from. Let me know what you think!
    Images: you should be able to click any of the images for a full-resolution image.
    Tutorial: the first thing you get after signing in is the tutorial.
    Landing: after the tutorial completes, you get a screen with services that are active on your account on the left, and a list of ALL services (db/blob/SQL Azure) on the right. I like the quick access to services across any of my subscriptions.
    Service Information: these are images from a running web site with several roles. I love how easy they have made many of the features.
    SQL Azure: they have given some great quick functionality for looking at your DB information.
    Storage: here is the basic information that they give you for any storage accounts you have.
    Adding Services: super quick and easy to add services with the new UI.
    Conclusion: I am EXCITED! As you may have seen in the left side of my blog, I am an MCPD in Azure Development, and I must say that I am excited to see Microsoft moving forward with the technology and not letting it stagnate. After as much as I have fought the other Azure dashboard, I like the friendliness and fluidity of this one. The important thing to note about ALL of the images above: this is HTML, not Silverlight. The responsiveness is FAST on all of the actions I completed, and I believe that this is a big step forward for Azure… So, what do you think?

  • Windows Azure Recipe: Big Data

    - by Clint Edmonson
    As the name implies, what we're talking about here is the explosion of electronic data that comes from huge volumes of transactions, devices, and sensors being captured by businesses today. This data often comes in unstructured formats and/or too fast for us to effectively process in real time. Collectively, we call these the four big data V's: Volume, Velocity, Variety, and Variability. These qualities make this type of data best managed by NoSQL systems like Hadoop, rather than by a conventional Relational Database Management System (RDBMS). We know that there are patterns hidden inside this data that might provide competitive insight into market trends. The key is knowing when and how to leverage these "NoSQL" tools combined with traditional business tools such as SQL-based relational databases, warehouses and other business intelligence tools.

    Drivers:
      Petabyte-scale data collection and storage
      Business intelligence and insight

    Solution: the sketch below shows one of many big data solutions, using Hadoop's unique highly scalable storage and parallel processing capabilities combined with Microsoft Office's Business Intelligence Components to access the data in the cluster.

    Ingredients:
      Hadoop – this big data industry heavyweight provides both large-scale data storage infrastructure and a highly parallelized map-reduce processing engine to crunch through the data efficiently. Here are the key pieces of the environment:
        Pig – a platform for analyzing large data sets that consists of a high-level language for expressing data analysis programs, coupled with infrastructure for evaluating these programs.
        Mahout – a machine learning library with algorithms for clustering, classification and batch-based collaborative filtering that are implemented on top of Apache Hadoop using the map/reduce paradigm.
        Hive – data warehouse software built on top of Apache Hadoop that facilitates querying and managing large datasets residing in distributed storage. Directly accessible to Microsoft Office and other consumers via add-ins and the Hive ODBC data driver.
        Pegasus – a peta-scale graph mining system that runs in a parallel, distributed manner on top of Hadoop and provides algorithms for important graph mining tasks such as degree, PageRank, Random Walk with Restart (RWR), radius, and connected components.
        Sqoop – a tool designed for efficiently transferring bulk data between Apache Hadoop and structured data stores such as relational databases.
        Flume – a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of log data to HDFS.
      Database – directly accessible to Hadoop via the Sqoop-based Microsoft SQL Server Connector for Apache Hadoop, so data can be efficiently transferred to traditional relational data stores for replication, reporting, or other needs.
      Reporting – provides easily consumable reporting when combined with a database being fed from the Hadoop environment.

    Training: these links point to online Windows Azure training labs where you can learn more about the individual ingredients described above.
      Hadoop Learning Resources (20+ tutorials and labs) – a huge collection of resources for learning about all aspects of Apache Hadoop-based development on Windows Azure and the Hadoop and Windows Azure ecosystems.
      SQL Azure (7 labs) – Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.

  • T-SQL Dynamic SQL and Temp Tables

    - by George
    It looks like #temp tables created using dynamic SQL via the EXECUTE-string method have a different scope and can't be referenced by "fixed" SQL in the same stored procedure. However, I can reference a temp table created by one dynamic SQL statement in a subsequent dynamic SQL statement, but it seems that a stored procedure does not return a query result to a calling client unless the SQL is fixed.

    A simple two-table scenario: I have two tables. Let's call them Orders and Items. Orders has a primary key of OrderId and Items has a primary key of ItemId. Items.OrderId is the foreign key identifying the parent Order. An Order can have 1 to n Items. I want to be able to provide a very flexible "query builder" type interface to the user to allow the user to select which Items he wants to see. The filter criteria can be based on fields from the Items table and/or from the parent Orders table. If an Item meets the filter condition, including any condition on the parent Order if one exists, the Item should be returned by the query, as well as the parent Order.

    Usually, I suppose, most people would construct a join between the Items table and the parent Orders table. I would like to perform two separate queries instead: one to return all of the qualifying Items, and the other to return all of the distinct parent Orders. The reason is twofold, and you may or may not agree. The first reason is that I need to query all of the columns in the parent Orders table, and if I did a single query joining Orders to Items, I would be repeating the Order information multiple times. Since there are typically a large number of Items per Order, I'd like to avoid this because it would result in much more data being transferred to a fat client. Instead, as mentioned, I would like to return the two tables individually in a dataset and use the two tables within it to populate custom Order and child Items client objects. (I don't know enough about LINQ or Entity Framework yet; I build my objects by hand.) The second reason I would like to return two tables instead of one is that I already have another procedure that returns all of the Items for a given OrderId along with the parent Order, and I would like to use the same two-table approach so that I can reuse the client code that populates my custom Order and child Item objects from the two datatables returned.

    What I was hoping to do was construct a dynamic SQL string on the client which joins the Orders table to the Items table and filters appropriately on each table, as specified by the custom filter created in the WinForms fat-client app. The SQL built on the client would have looked something like this:

      TempSQL = "
        INSERT INTO #ItemsToQuery (OrderId, ItemId)
        SELECT Orders.OrderId, Items.ItemId
        FROM Orders, Items
        WHERE Orders.OrderId = Items.OrderId
          AND /* some unpredictable Order filters go here */
          AND /* some unpredictable Item filters go here */ "

    Then, I would call a stored procedure:

      CREATE PROCEDURE GetItemsAndOrders (@tempSql AS text)
      AS
        EXECUTE (@tempSql) -- to create and populate the #ItemsToQuery table
        SELECT * FROM Items WHERE Items.ItemId IN (SELECT ItemId FROM #ItemsToQuery)
        SELECT * FROM Orders WHERE Orders.OrderId IN (SELECT DISTINCT OrderId FROM #ItemsToQuery)

    The problem with this approach is that the #ItemsToQuery table, since it was created by dynamic SQL, is inaccessible from the two static SQL statements that follow, and if I change the static SQL to dynamic, no results are passed back to the fat client.

    Three workarounds come to mind, but I'm looking for a better one:
      1. The first query could be performed by executing the dynamically constructed SQL from the client. The results could then be passed as a table to a modified version of the above stored procedure. I am familiar with passing table data as XML. If I did this, the stored proc could then insert the data into a temporary table using static SQL; because the table was created by static SQL, it could then be queried without issue. (I could also investigate passing the new table-type parameter instead of XML.) However, I would like to avoid passing up potentially large lists to a stored procedure.
      2. I could perform all the queries from the client. The first would be something like this:
           SELECT Items.* FROM Orders, Items WHERE Orders.OrderId = Items.OrderId AND (dynamic filter)
           SELECT Orders.* FROM Orders, Items WHERE Orders.OrderId = Items.OrderId AND (dynamic filter)
         This still provides me with the ability to reuse my client-side object-population code, because the Orders and Items continue to be returned in two different tables.
      3. I have a feeling, too, that I might have some options using a table data type within my stored proc, but that is also new to me and I would appreciate a little bit of spoon-feeding on that one.
    If you even scanned this far in what I wrote, I am surprised, but if so, I would appreciate any of your thoughts on how to accomplish this best.
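    It may be worth noting that the scoping works in the other direction: a temp table created by static SQL inside the procedure is visible both to dynamic SQL executed from that procedure and to the static statements that follow. So one common pattern for exactly this situation is to CREATE the table in the proc and let the dynamic SQL only populate it. A minimal sketch (the procedure and parameter names here are hypothetical):

        CREATE PROCEDURE GetItemsAndOrders2 (@filterSql nvarchar(max))
        AS
        BEGIN
            -- Created by static SQL, so it is visible both to the dynamic
            -- INSERT below and to the static SELECTs that follow.
            CREATE TABLE #ItemsToQuery (OrderId int NOT NULL, ItemId int NOT NULL);

            -- @filterSql carries the INSERT...SELECT built on the client.
            EXECUTE (@filterSql);

            SELECT * FROM Items
            WHERE ItemId IN (SELECT ItemId FROM #ItemsToQuery);

            SELECT * FROM Orders
            WHERE OrderId IN (SELECT DISTINCT OrderId FROM #ItemsToQuery);
        END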

  • Windows Azure Platform, latest version?

    - by Vimvq1987
    I searched through the internet but found nothing. The whitepapers for the Windows Azure Platform say things like this:
      "In its first release, the maximum size of a single database in SQL Azure Database is 10 gigabytes."
      "A few things are omitted in the technology's first release, however, such as the SQL Common Language Runtime (CLR) and support for spatial data. (Microsoft says that both will be available in a future version.)"
    I want to know whether Microsoft has updated the Windows Azure Platform and removed these limits or not. I decided to post this question here instead of on Serverfault.com because it's more related to programming than administration. Thank you.

  • Maximum number of workable tables in SQL Server And MySQL

    - by Kibbee
    I know that in SQL Server the maximum number of "objects" in a database is a little over 2 billion. Objects include tables, views, stored procedures and indexes, among other things. I'm not at all worried about going beyond 2 billion objects. What I would like to know is whether SQL Server suffers a performance hit from having a large number of tables. Does each table you add carry a performance cost, or is there basically no difference (assuming a constant amount of data)? Does anybody have any experience working with databases containing thousands of tables? I'm also wondering the same about MySQL.
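    For a sense of scale, the current counts are easy to check in SQL Server; a quick sketch using the standard catalog views (SQL Server 2005 and later):

        SELECT COUNT(*) AS object_count FROM sys.objects;  -- all schema-scoped objects
        SELECT COUNT(*) AS table_count  FROM sys.tables;   -- user tables only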

  • Windows Azure: need to know the data processing time

    - by veda
    I have stored some files in the form of blobs on Azure, and I have written an application that accesses these blobs. When I host this application as a web role on Azure, it works perfectly and I am happy with that. But now I want to know: what is the query time taken to access each blob file? I was searching for this in the Microsoft Azure Storage SLA, and I found that for the GetBlob request type, the maximum processing time should be within the product of 2 seconds multiplied by the number of MBs transferred in processing the request. I am still unclear. What is the actual processing time of my data query? How can I measure it? Will I be able to speed up the processing time? I understand that the processing time depends on internet speed, the location of the data center where my data is stored, and the location of the data center where my application is hosted. But still, will I be able to speed up my query?
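    As a worked example of that SLA formula: a single GetBlob request that transfers 4 MB would have to complete within 2 x 4 = 8 seconds to count as successful under the SLA. Roughly speaking, it is an upper bound used for the SLA's error-rate accounting, not a promise about typical latency, and it says nothing about the network time your client observes on top of it.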

  • Easiest way to retrofit retry logic on LINQ to SQL migration to SQL Azure

    - by Pat James
    I have a couple of existing ASP.NET Web Forms and MVC applications that currently use LINQ to SQL with a SQL Server 2008 Express database on a Windows VPS: one VPS for both IIS and SQL. I am starting to outgrow the VPS's ability to effectively host both SQL and IIS and am getting ready to split them up. I am considering migrating the database to SQL Azure and keeping IIS on the VPS. After doing initial research, it sounds like implementing retry logic in the data access layer is a must when adopting SQL Azure. I suspect this is even more critical in my situation, where IIS will be on a VPS outside of the Azure infrastructure. I am looking for pointers on how to do this with the least effort and impact on my existing code base. Is there a good retry pattern that can be applied once at the LINQ to SQL data access layer, as opposed to having to wrap all of my LINQ to SQL operations in try/catch/wait/retry logic?

  • Windows Azure worker roles: One big job or many small jobs?

    - by Ryan Elkins
    Is there any inherent advantage to using multiple workers to process pieces of procedural code versus processing the entire load? In other words, if my workflow looks like this:
      1. Get work from queue0 and do A
      2. Store the result from A in queue1
      3. Get the result from queue1 and do B
      4. Store the result from B in queue2
      5. Get the result from queue2 and do C
    is there an inherent advantage to using 3 workers that each do the entire process themselves, versus 3 workers that each do a part of the work (worker 1 does steps 1 & 2, worker 2 does steps 3 & 4, worker 3 does step 5)? If we only care about work being done (finishing step 5), it would seem to scale the same way (once you're using at least 3 workers). Maybe the big job is better because workers in that setup have fewer bottleneck issues?
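    A rough way to reason about it: if the three stages take tA, tB and tC per job, three generalist workers finish about 3 jobs per (tA + tB + tC) of elapsed time, while three specialized workers form a pipeline whose throughput is set by the slowest stage, roughly one job per max(tA, tB, tC). With perfectly balanced stages the two come out the same; with unbalanced stages the pipeline idles at the fast stages, which is the bottleneck issue mentioned above.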

  • Windows Azure Table Storage LINQ Operators

    - by Ryan Elkins
    Currently Table Storage supports the From, Where, Take, and First LINQ operators. Are there plans to support any of the other 29 operators? If we have to code these ourselves, how much of a performance difference are we looking at compared to something similar via SQL and SQL Server? Do you see it being somewhat comparable, or will it be far, far slower if I need to do a Count, a Sum, or a Group By over a gigantic dataset? I like the Azure platform and the idea of cloud-based storage. I like Windows Azure for the amount of data it can store and the schema-less nature of table storage. SQL Azure just won't work for us due to the high cost of storage space.
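    The practical gap comes from where the work happens: with no server-side aggregation, a Count, Sum or Group By against Table Storage means paging every matching entity across the wire (the service returns results in batches of at most 1,000 entities, with continuation tokens) and aggregating in your own code, whereas SQL Server computes the aggregate next to the data and returns one row. Over a gigantic dataset, expect the difference to be dominated by that transfer rather than by the arithmetic itself.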

  • "Initializing - Busy - Stopping" LOOP issue in Azure deployement

    - by Kushal Waikar
    Hi folks, I am trying to deploy an Azure cloud application on Windows Azure. The application's specifications are:
      It has one web role: an ASP.NET MVC application (the ASP.NET charting control is used in this MVC application).
      It does not contain any worker role.
      Third-party references (MVC, ASP charting control & ASP provider DLLs) are set with the property "Copy Local" set to "true".
      There is no DiagnosticsConnectionString in the service configuration file.
      It uses the ASP provider for session state management.
    This application runs successfully on the local dev fabric, but when I try to deploy it on Windows Azure it gets stuck in a loop, with the status changing between the Initializing, Busy and Stopping states. It never reaches the READY state. It seems that there are no ERROR logs conveying the deployment issues to the user. So is there any way to diagnose deployment issues? Is there any way to get deployment ERROR logs? Any kind of help will be appreciated. Thanks, Kushal

  • Error after updating to the latest version Azure SDK

    - by Mikael Johansson
    After I updated to the newest version of the Azure SDK, I started to get this error several times a day when I press Build in Visual Studio. The only way for me to fix it at the moment is to restart Visual Studio. The error I get is:
      Windows Azure Tools: Invalid access to memory location
    Has anyone else got this error? And if so, what did you do to fix it? Thanks in advance!
    Update 2012-08-28: the same error still exists in VS2012 and the Azure 1.7 SDK. However, the frequency has gone down with VS2012.

  • Replacement for Azure SDK Powershell commandlets for deployments

    - by Frank Rosario
    Hi, we've run into an issue with the New-Deployment Azure PowerShell commandlet timing out; we've put in a bug report with MS. While they gave us an explanation for it (the path and timeout threshold used for uploads through the commandlets are different from what's used by the web portal), they don't have a fix for us. We need to get this running so we can automate our build deployments, so we're looking into developing a custom commandlet to replace New-Deployment using the Azure SDK, hoping this path will not have the timeout issues the commandlet did. But before we go down that route: are there any other scriptable tools I can use to replace the New-Deployment functionality? I looked at CloudBerry for Windows Azure, but that doesn't have a scriptable interface yet. Any constructive input is greatly appreciated.
