Search Results

Search found 13842 results on 554 pages for 'reference identity'.

Page 157/554 | < Previous Page | 153 154 155 156 157 158 159 160 161 162 163 164  | Next Page >

  • Big Update for the Project Rosetta Site

    We just shipped a major update to the Project Rosetta site, including a new series of Flash to Silverlight tutorials, an updated API Guide with a quick reference list, and a full list of recommended tools, code samples, and frameworks to download.

    Read the article

  • How to use Azure storage for uploading and displaying pictures.

    - by Magnus Karlsson
    Basic setup of Azure storage for local development and production. This builds on the following guide from http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/ and adds a practical example that I believe is commonly needed, i.e. uploading and presenting an image from a user. First we set up local storage, then we configure it to work on a web role.

    Steps:
    1. Configure the connection string locally.
    2. Configure the model, controllers and Razor views.

    1. Set up the connection string
    1.1 Right click your web role and choose "Properties".
    1.2 Click Settings.
    1.3 Add setting.
    1.4 Name your setting. This will be the name of the connection string.
    1.5 Click the ellipsis to the right (the ellipsis appears when you select the row).
    1.6 In the window that appears, select "Windows Azure storage emulator" and click OK.

    Now we have a connection string to use. To be able to use it we need to make sure we have the Windows Azure tools for storage.
    2.1 Click Tools -> Library Package Manager -> Manage NuGet Packages for Solution.
    2.2 This is what it looks like after it has been added.

    Now on to what the code should look like.

    3.1 First we need a view which collects images to upload. Here Index.cshtml:

        @model List<string>

        @{
            ViewBag.Title = "Index";
        }

        <h2>Index</h2>
        <form action="@Url.Action("Upload")" method="post" enctype="multipart/form-data">
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file1" />
            <br />
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file2" />
            <br />
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file3" />
            <br />
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file4" />
            <br />
            <input type="submit" value="Submit" />
        </form>

        @foreach (var item in Model)
        {
            <img src="@item" alt="Alternate text"/>
        }

    3.2 We need a controller to receive the post. Notice the "containername" string I send to the BlobHandler. I use this as a folder for the pictures for each user. If this is not a requirement you could just call it "container" or any other lowercase name directly when creating the container.

        public ActionResult Upload(IEnumerable<HttpPostedFileBase> file)
        {
            BlobHandler bh = new BlobHandler("containername");
            bh.Upload(file);
            var blobUris = bh.GetBlobs();

            return RedirectToAction("Index", blobUris);
        }

    3.3 The handler model. I'll let the comments speak for themselves.

        public class BlobHandler
        {
            // Retrieve storage account from connection string.
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
                CloudConfigurationManager.GetSetting("StorageConnectionString"));

            private string imageDirecoryUrl;

            /// <summary>
            /// Receives the user's Id for where the pictures are and creates
            /// a blob container with that name if it does not exist.
            /// </summary>
            /// <param name="imageDirecoryUrl"></param>
            public BlobHandler(string imageDirecoryUrl)
            {
                this.imageDirecoryUrl = imageDirecoryUrl;

                // Create the blob client.
                CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

                // Retrieve a reference to a container.
                CloudBlobContainer container = blobClient.GetContainerReference(imageDirecoryUrl);

                // Create the container if it doesn't already exist.
                container.CreateIfNotExists();

                // Make available to everyone.
                container.SetPermissions(
                    new BlobContainerPermissions
                    {
                        PublicAccess = BlobContainerPublicAccessType.Blob
                    });
            }

            public void Upload(IEnumerable<HttpPostedFileBase> file)
            {
                // Create the blob client.
                CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

                // Retrieve a reference to a container.
                CloudBlobContainer container = blobClient.GetContainerReference(imageDirecoryUrl);

                if (file != null)
                {
                    foreach (var f in file)
                    {
                        if (f != null)
                        {
                            CloudBlockBlob blockBlob = container.GetBlockBlobReference(f.FileName);
                            blockBlob.UploadFromStream(f.InputStream);
                        }
                    }
                }
            }

            public List<string> GetBlobs()
            {
                // Create the blob client.
                CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

                // Retrieve reference to a previously created container.
                CloudBlobContainer container = blobClient.GetContainerReference(imageDirecoryUrl);

                List<string> blobs = new List<string>();

                // Loop over blobs within the container and output the URI to each of them.
                foreach (var blobItem in container.ListBlobs())
                    blobs.Add(blobItem.Uri.ToString());

                return blobs;
            }
        }

    3.4 When the files have been uploaded we present them to our user in the Index page. Pretty straightforward. In this example we only present the images by sending the URIs to the view. A better way would be to wrap them in a view model containing URI, metadata, alternate text, and other relevant information, but for this example this is all we need.

    4. Now press F5 in your solution to try it out. You can see the storage emulator UI here.
    4.1 If you get any exceptions or errors I suggest you first check whether the service is running correctly. I had problems with this; they seemed related to the installation, and a reboot fixed them.

    5. Set up for cloud storage. To do this we need to add configuration for the cloud just as we did for local storage in step one.
    5.1 We need our keys to do this. Go to the Windows Azure management portal, select the storage icon to the right and click "Manage keys". (Image from a different blog post though.)
    5.2 Do as in step 1, but replace step 1.6 with:
    1.6 Choose "Manually entered credentials". Enter your account name.
    1.7 Paste your Account Key from step 5.1 and click OK.
    5.3 Save, publish and run!

    Please feel free to ask any questions using the comments form at the bottom of this page. I will get back to you to help you solve any questions. Our consultancy agency also provides services in the Nordic regions if you would like any further support.
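    The post never shows the Index action that backs Index.cshtml. Below is a minimal sketch of what it might look like, reusing the same BlobHandler and "containername" from the steps above; the exact shape of this action is an assumption and not from the original post.

        public ActionResult Index()
        {
            // Assumed: list whatever has already been uploaded to the user's container.
            BlobHandler bh = new BlobHandler("containername");
            List<string> blobUris = bh.GetBlobs();

            // The view binds to List<string> and renders one <img> per URI.
            return View(blobUris);
        }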

    Read the article

  • Extending WikiPlex with Scope Augmenters

    - by mhawley
    [In addition to blogging, I am also using Twitter. Follow me: @matthawley] Another extension point with WikiPlex is Scope Augmenters. Scope Augmenters allow you to post process the collection of scopes to further augment, or insert/remove, new scopes prior to being rendered. WikiPlex comes with 3 out-of-the-box Scope Augmenters that it uses for indentation, tables, and lists. For reference, I'll be explaining… (read more)

    Read the article

  • ASP.NET MVC 3 Hosting :: How to Upgrade ASP.NET MVC 2 Project to ASP.NET MVC 3

    - by mbridge
    ASP.NET MVC 3 can be installed side by side with ASP.NET MVC 2 on the same computer, which gives you flexibility in choosing when to upgrade an ASP.NET MVC 2 application to ASP.NET MVC 3. The simplest way to upgrade is to create a new ASP.NET MVC 3 project, copy all the views, controllers, code, and content files from the existing MVC 2 project to the new project, and then update the assembly references in the new project to match the old project. If you have made changes to the Web.config file in the MVC 2 project, you must also merge those changes with the Web.config file in the MVC 3 project. To manually upgrade an existing ASP.NET MVC 2 application to version 3, do the following:

    1. In both Web.config files in the MVC 3 project, globally search and replace the MVC version. Find the following: System.Web.Mvc, Version=2.0.0.0 and replace it with: System.Web.Mvc, Version=3.0.0.0. There are three changes in the root Web.config and four in the Views\Web.config file.
    2. In Solution Explorer, delete the reference to System.Web.Mvc (which points to the version 2 DLL). Then add a reference to System.Web.Mvc (v3.0.0.0).
    3. In Solution Explorer, right-click the project name and then select Unload Project. Then right-click again and select Edit ProjectName.csproj.
    4. Locate the ProjectTypeGuids element and replace {F85E285D-A4E0-4152-9332-AB1D724D3325} with {E53F8FEA-EAE0-44A6-8774-FFD645390401}.
    5. Save the changes and then right-click the project and select Reload Project.
    6. If the project references any third-party libraries that are compiled against ASP.NET MVC 2, add the following bindingRedirect element to the Web.config file in the application root under the configuration section:

        <runtime>
          <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
            <dependentAssembly>
              <assemblyIdentity name="System.Web.Mvc"
                  publicKeyToken="31bf3856ad364e35"/>
              <bindingRedirect oldVersion="2.0.0.0" newVersion="3.0.0.0"/>
            </dependentAssembly>
          </assemblyBinding>
        </runtime>

    Other ASP.NET MVC 3 articles:
    - Rolling with Razor in MVC v3 Preview
    - Deploying ASP.NET MVC 3 web application to server where ASP.NET MVC 3 is not installed
    - RenderAction with ASP.NET MVC 3 Sessionless Controllers

    Read the article

  • Mounting a Microsoft Azure CloudDrive in a VMRole

    - by SeanBarlow
    Mounting a drive in a VMRole is a little more complicated than in a web or worker role. The web and worker roles offer OnStart and OnStop events, which you can use to mount or unmount your drives. The VMRole does not have these same events, so you have to provide another way for the drives to be mounted or unmounted. The problem I have run into is: what if you have multiple drives and you only want to mount certain drives? How do you let your user mount the drive? I am not going to go into details on what kind of GUI to present to the user. I have done this in a simple WPF application as well as a console application.

    We are going to need the storage account details. One thing to note: when you are mounting cloud drives you cannot use https and have to use http. We force the use of http by passing false when we create the CloudStorageAccount.

        StorageCredentialsAccountAndKey credentials =
            new StorageCredentialsAccountAndKey("AccountName", "AccountKey");
        CloudStorageAccount storageAccount = new CloudStorageAccount(credentials, false);

    Next we need to get a reference to the container.

        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("ContainerName");

    Now we need to get a list of the drives in the container.

        var drives = container.ListBlobs();

    Now that we have a list of the drives in the container we can let the user choose which drive they want to mount. I am just selecting the first drive in the list for the example and getting the Uri of the drive.

        var driveUri = drives.First().Uri;

    Now that we have the Uri we need to get the reference to the drive.

        var drive = new CloudDrive(driveUri, storageAccount.Credentials);

    Now all that is left is to mount the drive.

        var driveLetter = drive.Mount(0, DriveMountOptions.None);

    To unmount the drive all you have to do is call Unmount on the drive.

        drive.Unmount();

    You do need to make sure you unmount the drives when you are done with them. I have run into issues with the drives being locked until the VMRole is rebooted. I have also managed to have a drive be permanently locked and I was forced to delete it and upload it again. I have been unable to reproduce the permanent lock but I am still trying. The CloudDrive class provides a handy method to retrieve all the mounted drives in the role.

        foreach (var drive in CloudDrive.GetMountedDrives())
        {
            var mountedDrive = storageAccount.CreateCloudDrive(drive.Value.PathAndQuery);
            mountedDrive.Unmount();
        }
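    Since unmounting is stressed above and drives can end up locked, here is a small sketch that wraps the same mount and unmount calls in a try/finally so the drive is released even if the work with it throws. It reuses driveUri and storageAccount from the snippets above; DoWorkOn is a hypothetical placeholder for whatever the application does with the mounted drive.

        // Sketch: mount the chosen drive, use it, and always unmount it afterwards.
        CloudDrive drive = new CloudDrive(driveUri, storageAccount.Credentials);
        string driveLetter = drive.Mount(0, DriveMountOptions.None);
        try
        {
            DoWorkOn(driveLetter);   // hypothetical: application-specific work on the mounted drive
        }
        finally
        {
            // Unmounting here avoids the locked-drive situation described above.
            drive.Unmount();
        }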

    Read the article

  • Windows Azure Recipe: Enterprise LOBs

    - by Clint Edmonson
    Enterprises are more dependent on their specialized internal Line of Business (LOB) applications than ever before. Naturally, the more software they leverage on-premises, the more infrastructure they need to manage. It's frequently the case that our customers simply can't scale up their hardware purchases and operational staff as fast as internal demand for software requires. The result is that getting new or enhanced applications into the hands of business users becomes slower and more expensive every day. Being able to quickly deliver applications in a rapidly changing business environment while maintaining high standards of corporate security is a challenge that can be met right now by moving enterprise LOBs out into the cloud and leveraging Azure's Access Control services. In fact, we're seeing many of our customers (both large and small) see huge benefits from moving their web based business applications such as corporate help desks, expense tracking, travel portals, timesheets, and more to Windows Azure.

    Drivers
    - Cost reduction
    - Time to market
    - Security

    Solution
    Here's a sketch of how many Windows Azure Enterprise LOBs are being architected and deployed:

    Ingredients
    - Web Role – this will host the core of the application. Each web role is a virtual machine hosting an application written in ASP.NET (or optionally PHP or node.js). The number of web roles can be scaled up or down as needed to handle peak and non-peak traffic loads. Many Java based applications are also being deployed to Windows Azure with a little more effort.
    - Database – every modern web application needs to store data. SQL Azure databases look and act exactly like their on-premise siblings but are fault tolerant and have data redundancy built in.
    - Access Control – this service is necessary to establish federated identity between the cloud hosted application and an enterprise's corporate network. It works in conjunction with a secure token service (STS) that is hosted on-premises to establish the corporate user's identity and credentials. The source code for an on-premises STS is provided in the Windows Azure training kit and merely needs to be customized for the corporate environment and published on a publicly accessible corporate web site. Once set up, corporate users see a near seamless single sign-on experience.
    - Reporting – businesses live and die by their reports, and SQL Azure Reporting, based on SQL Server Reporting 2008 R2, can serve up reports with tables, charts, maps, gauges, and more. These reports can be accessed from the Windows Azure Portal, through a web browser, or directly from applications.
    - Service Bus (optional) – if deep integration with other applications and systems is needed, the service bus is the answer. It enables secure service layer communication between applications hosted behind firewalls in on-premises or partner datacenters and applications hosted inside Windows Azure. The Service Bus provides the ability to securely expose just the information and services that are necessary to create a simpler, more secure architecture than opening up a full blown VPN.
    - Data Sync (optional) – in cases where the data stored in the cloud needs to be shared internally, establishing a secure one-way or two-way data-sync connection between the on-premises and off-premises databases is a perfect option. It can be very granular, allowing us to specify exactly what tables and columns to synchronize, set up filters to sync only a subset of rows, set the conflict resolution policy for two-way sync, and specify how frequently data should be synchronized.

    Training Labs
    These links point to online Windows Azure training labs where you can learn more about the individual ingredients described above. (Note: the entire Windows Azure Training Kit can also be downloaded for offline use.)
    - Windows Azure (16 labs) – Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services that can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.
    - SQL Azure (7 labs) – Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    - Windows Azure Services (9 labs) – as applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.

    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.

    Read the article

  • Sony PSM SDK's 2D game engine

    - by Notbad
    I have started with the Sony PSM SDK this week. I'm interested in creating a little 2D game and have been reading around the web about a so-called "2D game engine" integrated into the SDK. Some information I read suggested that it was added in January 2012, but I have been going through the documentation and haven't been able to find any reference to it. Does anybody know if they finally introduced the 2D game engine for the PSM SDK? Thanks in advance.

    Read the article

  • An XEvent a Day (23 of 31) – How it Works – Multiple Transaction Log Files

    - by Jonathan Kehayias
    While working on yesterday’s blog post The Future – fn_dblog() No More? Tracking Transaction Log Activity in Denali, I did a quick Google search to find a specific blog post by Paul Randal to use as a reference, and in the results another blog post titled Investigating Multiple Transaction Log Files in SQL Server caught my eye, so I opened it in a new tab in IE and went about finishing the blog post. It probably wouldn’t have gotten my attention if it hadn’t been on the SqlServerPedia...(read more)

    Read the article

  • SSIS Expression Editor & Tester

    Published today on CodePlex is the SSIS Expression Editor & Tester project. If you want to try it just pop over to CodePlex and download it. About five years ago I developed my own expression editor control. It first got used in our custom tasks as the MS editor didn’t become available until SQL 2005 SP1, but even then mine had some handy features I preferred. For example, resizable panes, so that if your expression result was more than two lines you could see them all. It also meant I could change the functions available in the tree view, the most obvious use being to add some handy snippets and samples that I used a lot. This quickly developed into a small expression testing tool. I’d develop complex expressions using my editor and then copy them back into the package itself. I have been meaning to make the tool available for some time and finally made the effort; the code is checked in and the signed downloads are published on CodePlex. There are two flavours, SQL 2005 or 2008, and just a simple zip file to download and extract. The tool doesn’t need installing and is completely portable. It does need SSIS to be installed on the local machine though. Each zip file contains two files:
    - ExpressionTester.exe – the tool itself, run this.
    - ExpressionEditor.dll – the reusable editor control.
    A while ago the gentlemen behind BIDS Helper noticed the editor on a task and asked about using it. This became incorporated into their variable window extensions feature. To try and help them and anyone else that wants to use the editor control, it is available as a single assembly that you can reference yourself, and of course all the source code is on CodePlex too. Just add a reference to the ExpressionEditor.dll assembly and you should be up and running in no time. There is a sample project, Package Test, in the source code which shows how to use the editor control form in its simplest form, or if you want to host the control directly then the tester tool is a perfect example.

    Read the article

  • Matrix rotation of a rectangle to "face" a given point in 2d

    - by justin.m.chase
    Suppose you have a rectangle centered at point (0, 0) and I want to rotate it such that it is facing the point (100, 100). How would I do this purely with matrix math? To give some more specifics, I am using JavaScript and canvas and I may have something like this:

        var position = { x: 0, y: 0 };
        var destination = { x: 100, y: 100 };
        var transform = Matrix.identity();

        this.update = function(state) {
            // update transform to rotate to face destination
        };

        this.draw = function(ctx) {
            ctx.save();
            ctx.transform(transform); // a helper that just calls setTransform()
            ctx.beginPath();
            ctx.rect(-5, -5, 10, 10);
            ctx.fillStyle = 'Blue';
            ctx.fill();
            ctx.lineWidth = 2;
            ctx.stroke();
            ctx.restore();
        }

    Feel free to assume any matrix function you need is available.
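    As a worked illustration of the matrix math being asked about (not part of the original question): the facing angle is the atan2 of the vector from position to destination, and the 2D rotation matrix is built from its cosine and sine. The C# sketch below is mine and assumes the rectangle's local +x axis should point at the target; the six returned values map directly onto a canvas setTransform(a, b, c, d, e, f) call.

        using System;

        static class FaceTarget
        {
            // Returns the affine transform (a, b, c, d, e, f) that rotates a shape
            // centered at (px, py) so its local +x axis points at (dx, dy).
            public static (double a, double b, double c, double d, double e, double f)
                Towards(double px, double py, double dx, double dy)
            {
                double angle = Math.Atan2(dy - py, dx - px);  // direction to face
                double cos = Math.Cos(angle);
                double sin = Math.Sin(angle);

                // Rotation plus translation back to the shape's position:
                // | cos  -sin  e |
                // | sin   cos  f |
                return (cos, sin, -sin, cos, px, py);
            }
        }

    In the question's JavaScript, the same numbers would simply be fed into ctx.setTransform inside the draw call.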

    Read the article

  • Oracle Database 12 c New Partition Maintenance Features by Gwen Lazenby

    - by hamsun
    One of my favourite new features in Oracle Database 12c is the ability to perform partition maintenance operations on multiple partitions. This means we can now add, drop, truncate and merge multiple partitions in one operation, and can split a single partition into more than two partitions, also in just one command. This would certainly have made my life slightly easier had it been available when I administered a data warehouse on Oracle 9i.

    To demonstrate this new functionality and syntax, I am going to create two tables, ORDERS and ORDER_ITEMS, which have a parent-child relationship. ORDERS is to be partitioned using range partitioning on the ORDER_DATE column, and ORDER_ITEMS is going to be partitioned using reference partitioning and its foreign key relationship with the ORDERS table. This form of partitioning was a new feature in 11g and means that any partition maintenance operations performed on the ORDERS table will also take place on the ORDER_ITEMS table as well.

    First create the ORDERS table -

        SQL> CREATE TABLE orders
             ( order_id     NUMBER(12),
               order_date   TIMESTAMP,
               order_mode   VARCHAR2(8),
               customer_id  NUMBER(6),
               order_status NUMBER(2),
               order_total  NUMBER(8,2),
               sales_rep_id NUMBER(6),
               promotion_id NUMBER(6),
               CONSTRAINT orders_pk PRIMARY KEY(order_id) )
             PARTITION BY RANGE(order_date)
             (PARTITION Q1_2007 VALUES LESS THAN (TO_DATE('01-APR-2007','DD-MON-YYYY')),
              PARTITION Q2_2007 VALUES LESS THAN (TO_DATE('01-JUL-2007','DD-MON-YYYY')),
              PARTITION Q3_2007 VALUES LESS THAN (TO_DATE('01-OCT-2007','DD-MON-YYYY')),
              PARTITION Q4_2007 VALUES LESS THAN (TO_DATE('01-JAN-2008','DD-MON-YYYY')));

        Table created.

    Now the ORDER_ITEMS table -

        SQL> CREATE TABLE order_items
             ( order_id     NUMBER(12) NOT NULL,
               line_item_id NUMBER(3)  NOT NULL,
               product_id   NUMBER(6)  NOT NULL,
               unit_price   NUMBER(8,2),
               quantity     NUMBER(8),
               CONSTRAINT order_items_fk
                 FOREIGN KEY(order_id) REFERENCES orders(order_id) on delete cascade)
             PARTITION BY REFERENCE(order_items_fk) tablespace example;

        Table created.

    Now look at DBA_TAB_PARTITIONS to get details of what partitions we have in the two tables -

        SQL> select table_name, partition_name, partition_position position, high_value
             from dba_tab_partitions
             where table_owner='SH' and table_name like 'ORDER_%'
             order by partition_position, table_name;

        TABLE_NAME     PARTITION_NAME  POSITION  HIGH_VALUE
        -------------- --------------- --------  -------------------------------
        ORDERS         Q1_2007         1         TIMESTAMP' 2007-04-01 00:00:00'
        ORDER_ITEMS    Q1_2007         1
        ORDERS         Q2_2007         2         TIMESTAMP' 2007-07-01 00:00:00'
        ORDER_ITEMS    Q2_2007         2
        ORDERS         Q3_2007         3         TIMESTAMP' 2007-10-01 00:00:00'
        ORDER_ITEMS    Q3_2007         3
        ORDERS         Q4_2007         4         TIMESTAMP' 2008-01-01 00:00:00'
        ORDER_ITEMS    Q4_2007         4

    Just as an aside, it is also now possible in 12c to use interval partitioning on reference partitioned tables. In 11g it was not possible to combine these two new partitioning features.

    For our first example of the new 12c functionality, let us add all the partitions necessary for 2008 to the tables using one command. Notice that the partition specification part of the add command is identical in format to the partition specification part of the create command shown above -

        SQL> alter table orders add
             PARTITION Q1_2008 VALUES LESS THAN (TO_DATE('01-APR-2008','DD-MON-YYYY')),
             PARTITION Q2_2008 VALUES LESS THAN (TO_DATE('01-JUL-2008','DD-MON-YYYY')),
             PARTITION Q3_2008 VALUES LESS THAN (TO_DATE('01-OCT-2008','DD-MON-YYYY')),
             PARTITION Q4_2008 VALUES LESS THAN (TO_DATE('01-JAN-2009','DD-MON-YYYY'));

        Table altered.
    Now look at DBA_TAB_PARTITIONS and we can see that the 4 new partitions have been added to both tables -

        SQL> select table_name, partition_name, partition_position position, high_value
             from dba_tab_partitions
             where table_owner='SH' and table_name like 'ORDER_%'
             order by partition_position, table_name;

        TABLE_NAME     PARTITION_NAME  POSITION  HIGH_VALUE
        -------------- --------------- --------  -------------------------------
        ORDERS         Q1_2007         1         TIMESTAMP' 2007-04-01 00:00:00'
        ORDER_ITEMS    Q1_2007         1
        ORDERS         Q2_2007         2         TIMESTAMP' 2007-07-01 00:00:00'
        ORDER_ITEMS    Q2_2007         2
        ORDERS         Q3_2007         3         TIMESTAMP' 2007-10-01 00:00:00'
        ORDER_ITEMS    Q3_2007         3
        ORDERS         Q4_2007         4         TIMESTAMP' 2008-01-01 00:00:00'
        ORDER_ITEMS    Q4_2007         4
        ORDERS         Q1_2008         5         TIMESTAMP' 2008-04-01 00:00:00'
        ORDER_ITEMS    Q1_2008         5
        ORDERS         Q2_2008         6         TIMESTAMP' 2008-07-01 00:00:00'
        ORDER_ITEMS    Q2_2008         6
        ORDERS         Q3_2008         7         TIMESTAMP' 2008-10-01 00:00:00'
        ORDER_ITEMS    Q3_2008         7
        ORDERS         Q4_2008         8         TIMESTAMP' 2009-01-01 00:00:00'
        ORDER_ITEMS    Q4_2008         8

    Next, we can drop or truncate multiple partitions by giving a comma separated list in the alter table command. Note the use of the plural 'partitions' in the command, as opposed to the singular 'partition' prior to 12c -

        SQL> alter table orders drop partitions Q3_2008, Q2_2008, Q1_2008;

        Table altered.

    Now look at DBA_TAB_PARTITIONS and we can see that the 3 partitions have been dropped from both tables -

        TABLE_NAME     PARTITION_NAME  POSITION  HIGH_VALUE
        -------------- --------------- --------  -------------------------------
        ORDERS         Q1_2007         1         TIMESTAMP' 2007-04-01 00:00:00'
        ORDER_ITEMS    Q1_2007         1
        ORDERS         Q2_2007         2         TIMESTAMP' 2007-07-01 00:00:00'
        ORDER_ITEMS    Q2_2007         2
        ORDERS         Q3_2007         3         TIMESTAMP' 2007-10-01 00:00:00'
        ORDER_ITEMS    Q3_2007         3
        ORDERS         Q4_2007         4         TIMESTAMP' 2008-01-01 00:00:00'
        ORDER_ITEMS    Q4_2007         4
        ORDERS         Q4_2008         5         TIMESTAMP' 2009-01-01 00:00:00'
        ORDER_ITEMS    Q4_2008         5

    Now let us merge all the 2007 partitions together to form one single partition -

        SQL> alter table orders merge partitions Q1_2007, Q2_2007, Q3_2007, Q4_2007
             into partition Y_2007;

        Table altered.

        TABLE_NAME     PARTITION_NAME  POSITION  HIGH_VALUE
        -------------- --------------- --------  -------------------------------
        ORDERS         Y_2007          1         TIMESTAMP' 2008-01-01 00:00:00'
        ORDER_ITEMS    Y_2007          1
        ORDERS         Q4_2008         2         TIMESTAMP' 2009-01-01 00:00:00'
        ORDER_ITEMS    Q4_2008         2

    Splitting partitions is slightly more involved. In the case of range partitioning one of the new partitions must have no high value defined, and in list partitioning one of the new partitions must have no list of values defined. I call these partitions the 'everything else' partitions; they will contain any rows contained in the original partition that are not contained in any of the other new partitions.
    For example, let us split the Y_2007 partition back into 4 quarterly partitions -

        SQL> alter table orders split partition Y_2007 into
             (PARTITION Q1_2007 VALUES LESS THAN (TO_DATE('01-APR-2007','DD-MON-YYYY')),
              PARTITION Q2_2007 VALUES LESS THAN (TO_DATE('01-JUL-2007','DD-MON-YYYY')),
              PARTITION Q3_2007 VALUES LESS THAN (TO_DATE('01-OCT-2007','DD-MON-YYYY')),
              PARTITION Q4_2007);

    Now look at DBA_TAB_PARTITIONS to get details of the new partitions -

        TABLE_NAME     PARTITION_NAME  POSITION  HIGH_VALUE
        -------------- --------------- --------  -------------------------------
        ORDERS         Q1_2007         1         TIMESTAMP' 2007-04-01 00:00:00'
        ORDER_ITEMS    Q1_2007         1
        ORDERS         Q2_2007         2         TIMESTAMP' 2007-07-01 00:00:00'
        ORDER_ITEMS    Q2_2007         2
        ORDERS         Q3_2007         3         TIMESTAMP' 2007-10-01 00:00:00'
        ORDER_ITEMS    Q3_2007         3
        ORDERS         Q4_2007         4         TIMESTAMP' 2008-01-01 00:00:00'
        ORDER_ITEMS    Q4_2007         4
        ORDERS         Q4_2008         5         TIMESTAMP' 2009-01-01 00:00:00'
        ORDER_ITEMS    Q4_2008         5

    Partition Q4_2007 has a high value equal to the high value of the original Y_2007 partition, and so has inherited its upper boundary from the partition that was split.

    As for a list partitioning example, let us look at another table, SALES_PAR_LIST, which has 2 partitions, Americas and Europe, and a partitioning key of country_name.

        SQL> select table_name, partition_name, high_value
             from dba_tab_partitions
             where table_owner='SH' and table_name = 'SALES_PAR_LIST';

        TABLE_NAME     PARTITION_NAME  HIGH_VALUE
        -------------- --------------- -----------------------------
        SALES_PAR_LIST AMERICAS        'Argentina', 'Canada', 'Peru', 'USA', 'Honduras', 'Brazil', 'Nicaragua'
        SALES_PAR_LIST EUROPE          'France', 'Spain', 'Ireland', 'Germany', 'Belgium', 'Portugal', 'Denmark'

    Now split the Americas partition into 3 partitions -

        SQL> alter table sales_par_list split partition americas into
             (partition south_america values ('Argentina','Peru','Brazil'),
              partition north_america values ('Canada','USA'),
              partition central_america);

        Table altered.

    Note that no list of values was given for the 'Central America' partition. However, it should have inherited any values in the original 'Americas' partition that were not assigned to either the 'North America' or 'South America' partitions. We can confirm this by looking at the DBA_TAB_PARTITIONS view.

        SQL> select table_name, partition_name, high_value
             from dba_tab_partitions
             where table_owner='SH' and table_name = 'SALES_PAR_LIST';

        TABLE_NAME      PARTITION_NAME   HIGH_VALUE
        --------------- ---------------  --------------------------------
        SALES_PAR_LIST  SOUTH_AMERICA    'Argentina', 'Peru', 'Brazil'
        SALES_PAR_LIST  NORTH_AMERICA    'Canada', 'USA'
        SALES_PAR_LIST  CENTRAL_AMERICA  'Honduras', 'Nicaragua'
        SALES_PAR_LIST  EUROPE           'France', 'Spain', 'Ireland', 'Germany', 'Belgium', 'Portugal', 'Denmark'

    In conclusion, I hope that DBAs whose work involves maintaining partitions will find these operations a bit more straightforward to carry out once they have upgraded to Oracle Database 12c.

    Gwen Lazenby is a Principal Training Consultant at Oracle. She is part of Oracle University's Core Technology delivery team based in the UK, teaching Database Administration and Linux courses. Her specialist topics include using Oracle Partitioning and Parallelism in Data Warehouse environments, as well as Oracle Spatial and RMAN.

    Read the article

  • Beware of const members

    - by nmarun
    I happened to learn a new thing about const today and how one needs to be careful with its usage. Let's say I have a third-party assembly 'ConstVsReadonlyLib' with a class named ConstSideEffect.cs:

        public class ConstSideEffect
        {
            public static readonly int StartValue = 10;
            public const int EndValue = 20;
        }

    In my project, I reference the above assembly as follows:

        static void Main(string[] args)
        {
            for (int i = ConstSideEffect.StartValue; i < ConstSideEffect.EndValue; i++)
            {
                Console.WriteLine(i);
            }
            Console.ReadLine();
        }

    You'll see values 10 through 19 as expected. Now, let's say I receive a new version of the ConstVsReadonlyLib:

        public class ConstSideEffect
        {
            public static readonly int StartValue = 5;
            public const int EndValue = 30;
        }

    If I just drop this new assembly in the bin folder and run the application, without rebuilding my console application, my thinking was that the output would be from 5 to 29. Of course I was wrong... otherwise you'd not be reading this blog. The actual output is from 5 through 19. The reason is the behavior of const and readonly members. To begin with, const is a compile-time constant and readonly is a runtime constant. Next, when you compile the code, a compile-time constant member is replaced with the value of the constant in the code. But the IL generated when you reference a readonly constant references the readonly variable, not its value. So the IL version of the Main method, after compilation, actually looks something like:

        static void Main(string[] args)
        {
            for (int i = ConstSideEffect.StartValue; i < 20; i++)
            {
                Console.WriteLine(i);
            }
            Console.ReadLine();
        }

    I'm no expert with this IL thingi, but when I look at the disassembled code of the exe file (using IL Disassembler), I see our readonly member still being referenced by the variable name (ConstVsReadonlyLib.ConstSideEffect::StartValue) in line 0001. Then there's the Console.WriteLine in line 000b and finally the value of 20 in line 0017. This, I'm pretty sure, is our const member being replaced by its value, which marks the upper bound of the 'for' loop. Now you know why the output was from 5 through 19. This definitely is a side-effect of having const members and one needs to be aware of it. While we're here, I'd like to add a few other points about const and readonly members:
    - const is slightly faster, but is less flexible
    - readonly cannot be declared within a method scope
    - const can be used only on primitive types (numbers and strings)

    Just wanted to share this before going to bed!
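    A small follow-up sketch of the takeaway (mine, not from the post; class and member names are made up): if a value published by a library might change between releases, exposing it as static readonly rather than const keeps consumers correct without a recompile, at the cost of a field lookup instead of an inlined literal.

        public class LibraryLimits
        {
            // Inlined into every consuming assembly at compile time; changing it
            // later requires those consumers to be rebuilt.
            public const int ProtocolVersion = 2;

            // Read at run time from this assembly, so a newer DLL dropped into the
            // bin folder is picked up without recompiling consumers.
            public static readonly int MaxBatchSize = 500;
        }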

    Read the article

  • How to get GameElements (RigidBody) size in Unity

    - by Shivan Dragon
    I've made a prefab consisting of a Cube which I've first scaled to more resemble a brick. There's also a Rigidbody added to the cube (in the prefab). Now I want to use that prefab in a c# script to make a wall out of multiple bricks. My question is, how can I access the dimensions of my brick (width, height, the z dimension size) so that in my script I can make bricks which are placed one next to the other (and then one on top of the other)? I've looked at the documentation for GameObject and Rigidbody but I can't find anything helpful. Just for refference, my script so far is: public GameObject brick; void Start () { Instantiate(this.brick, new Vector3(0.01326297f, -30.07855f, 100f), Quaternion.identity); // int brickWidth = this.brick.????; }

    Read the article

  • GLSL vertex shaders with movements vs vertex off the screen

    - by user827992
    If I have a vertex shader that manages some movements and variations of the positions of some vertices in my OpenGL context, is OpenGL smart enough to run this shader only on the vertices visible on the screen? This part of the OpenGL programmable pipeline is not clear to me because the sources are not really clear about it; they talk about fragments and pixels and I get that, but what about vertex shaders? If you need a reference, I'm reading from this right now and this online book has a couple of examples about this.

    Read the article

  • SQL SERVER List All the DMV and DMF on Server

    “How many DMVs and DMFs are there in SQL Server 2008?” – this question was asked to me in one of the recent SQL Server trainings. The answer is very simple: SELECT name, type, type_desc FROM sys.system_objects WHERE name LIKE 'dm_%' ORDER BY name. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, [...]

    Read the article

  • Tab Sweep: Primefaces3, @DataSourceDefinition, JPA Extensions, EclipseLink, Typed Query, Ajax, ...

    - by arungupta
    Recent Tips and News on Java, Java EE 6, GlassFish & more:
    • JSF2 + Primefaces3 + EJB3 & JPA2 Integration Project (@henk53)
    • The state of @DataSourceDefinition in Java EE (@henk53)
    • Java Persistence API (JPA) Extensions Reference for EclipseLink (EclipseLink)
    • JavaFX 2.2 Pie Chart with JPA 2.0 (John Yeary)
    • Typed Query RESTful Service Example (John Yeary)
    • How to set environment variables in GlassFish admin console (Jelastic)
    • Architect Enterprise Applications with Java EE (Oracle University)
    • Glassfish – Basic authentication (Marco Ghisellini)
    • Solving GlassFish 3.1/JSF PWC4011 warning (Rafael Nadal)
    • PrimeFaces AJAX Enabled (John Yeary)

    Read the article

  • How to handle gender and sexual orientation on a form with select boxes more inclusively?

    - by Drew
    The existing question and the answers for it are not satisfying when one wants to be more inclusive. Gender as Male, Female or No Answer works for some sites, but not others. Taking the view that gender is not the same as sex (assigned at birth based on bodily characteristics), would it be more inclusive to include the two options Transgender Male and Transgender Female? In other words, would more inclusive options be the following?

    Gender Identity
    - Male
    - Female
    - Transgender Male
    - Transgender Female
    - No Answer

    Sexual Orientation
    - Straight
    - Lesbian
    - Gay
    - Bisexual
    - Asexual
    - No Answer

    Gender Expression could also be included, but I think most people would find that too confusing (it is defined on GLAAD's website, linked below). I'm going off what I've read on GLAAD's Transgender Glossary of Terms. Thanks!

    Read the article

  • SCORM and the Learning Management System (LMS)

    What actually is SCORM? SCORM, Shareable Content Object Reference Model, is a standard for web-based e-learning that has been developed to define communication between client-side content and a runti... [Author: Stuart Campbell - Computers and Internet - October 05, 2009]

    Read the article

  • CUBEMEMBER and CUBEVALUE stop working after #PowerPivot upgrade to #Excel 2013

    - by Marco Russo (SQLBI)
    I found an issue upgrading an Excel workbook containing PowerPivot data from Excel 2010 to Excel 2013. All CUBEMEMBER and CUBEVALUE functions point to a cube name that has changed between the two versions – you no longer reference the PowerPivot Data name, but use ThisWorkbookDataModel instead. I describe the change that you have to make manually to these Excel formulas in an article on the SQLBI web site.

    Read the article

  • Sony PSM sdk and 2d Game engine

    - by Notbad
    I have started with the Sony PSM SDK this week. I'm interested in creating a little 2D game and have been reading around the web about a so-called "2D game engine" integrated in PSM. Some information I read suggested that it was going to be added in January 2012, but I have been going through the documentation and haven't been able to find any reference to it. Does anybody know if they finally introduced the 2D game engine for PSM? Thanks in advance.

    Read the article

  • CoolCommands for Visual Studio 2010

    - by ChrisD
    Gaston Milano has just informed me that he has a new version of CoolCommands for Visual Studio 2010.  In addition to all the existing commands, the new release, now called CoolX,  supports Context Explorer shell commands, support for multiple monitors and new features to help tame the Project Reference beast. Check out all the info including a download, available here –>http://visualstudiogallery.msdn.microsoft.com/en-us/53fe63d5-780d-409b-afc3-10d05906e0a6 I’m also hosting a version for download here. Thanks Gaston!

    Read the article

  • Scripts won't affect clones - Unity3d

    - by user3666251
    I made a script which swaps two game objects on click. But the script won't work because the objects are actually clones of the original prefab. This is the script (UnityScript):

        #pragma strict

        var object1 : GameObject;
        var object2 : GameObject;

        function OnMouseDown () {
            Instantiate(object2, object1.transform.position, object1.transform.rotation);
            Destroy(object1);
        }

    I use this script to create the other game objects (clones) [C#]:

        using UnityEngine;
        using System.Collections;

        public class Spawner : MonoBehaviour
        {
            public GameObject[] obj;
            public float spawnMin = 1f;
            public float spawnMax = 2f;

            // Use this for initialization
            void Start ()
            {
                Spawn ();
            }

            void Spawn()
            {
                Instantiate(obj[Random.Range(0, obj.GetLength(0))], transform.position, Quaternion.identity);
                Invoke ("Spawn", Random.Range (spawnMin, spawnMax));
            }
        }

    The objects get renamed to NAME (Clone). What I want to do is make the script affect clones too, so they swap when I click on them.
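    Not from the original post, but a common way around this is to stop referencing a specific scene object (object1) and let the clicked clone swap itself instead. A hedged C# sketch of that idea, with the class and field names being mine:

        using UnityEngine;

        // Attach this to the prefab itself, so every clone carries it.
        public class SwapOnClick : MonoBehaviour
        {
            // The prefab to swap in, assigned on the prefab in the inspector.
            public GameObject replacement;

            // OnMouseDown requires a collider on the clone (a cube prefab already has one).
            void OnMouseDown()
            {
                // 'gameObject' is the clone that was actually clicked,
                // so no reference to a particular scene object is needed.
                Instantiate(replacement, transform.position, transform.rotation);
                Destroy(gameObject);
            }
        }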

    Read the article

  • Unity: Assigning a key to perform an action in the inspector

    - by Marc Pilgaard
    I am trying to write a simple piece of code in JavaScript where a button toggles the activation of a shield, by loading a prefab with Resources.Load("ActivateShieldPreFab") and destroying it again (I haven't implemented that part yet). I wish to assign this button through the inspector, so I have created a string variable which appears as intended in the inspector. However, it doesn't seem to register the inspector input, even though I changed the value through the inspector. It only produces the error: "Input Key named: is unknown". When the button name is assigned within the code, there are no issues. Code as follows:

        var ShieldOn = false;
        var stringbutton : String;

        function Start() {
        }

        function Update () {
            if (Input.GetKey(stringbutton) && ShieldOn != true)
            {
                Instantiate(Resources.Load("ActivateShieldPreFab"), Vector3(0, 0, 0), Quaternion.identity);
                ShieldOn = true;
            }
        }
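    Not part of the original post: an alternative that sidesteps string key names altogether is to expose a KeyCode field, which Unity shows in the inspector as a dropdown of valid keys. A hedged C# sketch of the same toggle idea, with the class name and default key being my own choices:

        using UnityEngine;

        public class ShieldToggle : MonoBehaviour
        {
            // Appears in the inspector as a dropdown, so an unknown key name is impossible.
            public KeyCode shieldKey = KeyCode.S;

            private bool shieldOn = false;

            void Update()
            {
                if (Input.GetKeyDown(shieldKey) && !shieldOn)
                {
                    // Assumes the prefab lives in a Resources folder, as in the original post.
                    Instantiate(Resources.Load("ActivateShieldPreFab"), Vector3.zero, Quaternion.identity);
                    shieldOn = true;
                }
            }
        }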

    Read the article

  • java webservice requires usernametoken over basichttpbinding (3 replies)

    I need to call a Java webservice. I can add a service reference without problems, and I get IntelliSense in Visual Studio. However, when I try to call a service method I get an error message saying "Missing (user) Security Information". In my code I try to set user credentials:

        testWS.WarrantyClaimServiceClient svc = new TestClient.testWS.WarrantyClaimServiceClient();
        svc.ClientCredentials.UserName....
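    Not part of the original thread: in WCF, "UsernameToken over basicHttpBinding" is usually expressed as a BasicHttpBinding carrying message-level username credentials. A minimal sketch under that assumption (the endpoint URL is a placeholder, and the exact security mode depends on what the Java service expects):

        using System.ServiceModel;

        // Sketch: BasicHttpBinding carrying a WS-Security UsernameToken over HTTPS.
        var binding = new BasicHttpBinding(BasicHttpSecurityMode.TransportWithMessageCredential);
        binding.Security.Message.ClientCredentialType = BasicHttpMessageCredentialType.UserName;

        var endpoint = new EndpointAddress("https://example.com/WarrantyClaimService"); // placeholder URL

        var svc = new TestClient.testWS.WarrantyClaimServiceClient(binding, endpoint);
        svc.ClientCredentials.UserName.UserName = "user";      // credentials expected by the service
        svc.ClientCredentials.UserName.Password = "password";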

    Read the article

  • tunnel effect cocos2d

    - by samfisher
    I am looking to create a similar tunnel effect in cocos2d (iOS). Could anyone suggest any pointers? ref Video 1 ref Video 2. So far I have tried several ring-shaped sprites with decreasing scale, positioned centered on the same point, with Z decreasing as well for each smaller sprite. I then animate them with CCScaleTo, changing the scale to 2.0 over the animation duration, but it does not come anywhere near the tunnel effect shown in the references. Thanks, sam

    Read the article
