
Aug. 5th, 2013

(no subject)

  • 1 387 RUB
  • 2 414 RUB
  • 3 207 RUB
  • 6 379 RUB

Jun. 13th, 2010

XsltDb - DotNetNuke Multiportal Integration Platform

Hello everybody!

My new project is XsltDb, a DotNetNuke module that lets you rapidly build what you need by integrating existing technologies. I have already released a number of versions and can say now that the module is successful. I now use it in commercial projects, such as Pressa-Online, and promote it on the web.

For the moment, XsltDb supports simultaneous use of the following technologies and capabilities:
  • DotNetNuke
  • XSLT
  • SQL
  • ASP.NET controls
  • Telerik controls
  • jQuery
  • Act as XML datasource for other sites
  • Access remote XML documents.
A live demo of the XsltDb DotNetNuke module can be found here: http://xsltdb.com. Many more demos are coming soon.

You are welcome to download the installation package and request the features you need. I'm very interested in co-developers, as they give me the right direction to move forward.

Upcoming features are:
  • localization support,
  • ISearchable support, to add any content to the DotNetNuke search index,
  • IUpgradable support, for a more reliable upgrade process,
  • module settings support, so a developer can write one configuration and set up different modules using configuration options,
  • configuration packaging, to allow configuration distribution and one-click installation,
  • a SkinObject creation feature, so that an XsltDb configuration can act as a skin object.
Need more features? Let me know about them!

bye.

Jun. 25th, 2009

PostgreSQL Window Functions.


Window functions are the greatest treasure of SQL:2003/2008, and now we have them implemented in PostgreSQL 8.4. I have prepared an article at wiki.postgresql.org (PDF). The following tasks are discussed in it:
  • Running Totals
  • Gap Search
  • Stock Control With FIFO Costing Methods
  • Personnel Motivation Table
  • Derivation And Integration
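As a taste of what the article covers: a running total is a one-line window aggregate. A minimal sketch using Python's sqlite3, assuming a SQLite build with window-function support; the ledger table and its values are made up for illustration:

```python
import sqlite3

# A made-up ledger table; names and values are illustrative.
con = sqlite3.connect(":memory:")
con.execute("create table ledger (id integer primary key, amount integer)")
con.executemany("insert into ledger(amount) values (?)", [(10,), (20,), (5,)])

# The running total is a single window aggregate: no self-join, no cursor.
rows = con.execute(
    "select id, amount, sum(amount) over (order by id) as total from ledger"
).fetchall()
print(rows)  # [(1, 10, 10), (2, 20, 30), (3, 5, 35)]
```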

See also this presentation

bye.

PostgreSQL. Common Table Expressions. Trees.


In PostgreSQL 8.4 a great, long-awaited feature was introduced: Common Table Expressions (CTE). The main benefit of CTE is recursive queries (all other applications of CTE are possible with subqueries). Using this feature we can query trees with compact, easily readable queries. Also have a look at this presentation. Nice.

Now I'll show you how one can create a whole tree with a single SQL statement. This is a common task when creating a test database, especially for load and volume testing. Then we'll query a whole subtree, also with a single SQL statement.

First, create a sequence and a tree table, then load records into it.

create sequence people_seq;

create table people
(
      ID bigint primary key default nextval('people_seq'),
      ManagerID bigint,
      PersonName varchar(100)
);

insert into people(ID, ManagerID, PersonName)
with recursive people_tree
      (ID, ManagerID, PersonName, NodeLevel) as
(
      -- First, insert root node
      select
            nextval('people_seq') ID,
            null::bigint ManagerID,
            'ROOT' PersonName,
            0 NodeLevel
     
      union all

      -- Second, generate children recursively
      select
            nextval('people_seq') ID,
            d.ID ManagerID,
            d.PersonName || '.' || s PersonName,
            d.NodeLevel + 1 NodeLevel
      from
            people_tree d,
            generate_series(1,2) s -- Create 2 descendants
      where
            d.NodeLevel < 3 -- Maximum level of depth
)
select ID, ManagerID, PersonName from people_tree;

 ID | ManagerID | PersonName
  1 |           | ROOT
  2 |         1 | ROOT.1
  3 |         1 | ROOT.2
  4 |         2 | ROOT.1.1
  5 |         2 | ROOT.1.2
  6 |         3 | ROOT.2.1
  7 |         3 | ROOT.2.2
  8 |         4 | ROOT.1.1.1
  9 |         4 | ROOT.1.1.2
 10 |         5 | ROOT.1.2.1
 11 |         5 | ROOT.1.2.2
 12 |         6 | ROOT.2.1.1
 13 |         6 | ROOT.2.1.2
 14 |         7 | ROOT.2.2.1
 15 |         7 | ROOT.2.2.2


The amazing thing is that we create the whole tree with a single query. And we can also select a subtree with a single query, say the whole subtree starting from "ROOT.1":

with recursive people_tree
      (ID, ManagerID, PersonName) as
(
      select * from people
      where PersonName = 'ROOT.1'

      union all

      select p.* from people p
      join people_tree d on d.ID = p.ManagerID
)
select * from people_tree;

 ID | ManagerID | PersonName
  2 |         1 | ROOT.1
  4 |         2 | ROOT.1.1
  5 |         2 | ROOT.1.2
  8 |         4 | ROOT.1.1.1
  9 |         4 | ROOT.1.1.2
 10 |         5 | ROOT.1.2.1
 11 |         5 | ROOT.1.2.2

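The same recursive pattern can be checked outside PostgreSQL. A minimal sketch using Python's sqlite3 (SQLite also supports WITH RECURSIVE); the lowercase column names and the small hand-inserted tree are illustrative, not from the post:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table people (id integer primary key, manager_id integer, name text)")
con.executemany(
    "insert into people values (?, ?, ?)",
    [(1, None, "ROOT"), (2, 1, "ROOT.1"), (3, 1, "ROOT.2"),
     (4, 2, "ROOT.1.1"), (5, 2, "ROOT.1.2")],
)

# Anchor on 'ROOT.1', then repeatedly join children onto the growing result.
subtree = con.execute("""
    with recursive t(id, manager_id, name) as (
        select id, manager_id, name from people where name = 'ROOT.1'
        union all
        select p.id, p.manager_id, p.name
        from people p join t on t.id = p.manager_id
    )
    select name from t order by id
""").fetchall()
print([n for (n,) in subtree])  # ['ROOT.1', 'ROOT.1.1', 'ROOT.1.2']
```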
bye.

Jun. 19th, 2009

SQL. Fast GUID List Generator.

If you want your database to be fast with a huge amount of data, you should test it with a large test database. To create a test database you may need many GUIDs for row ids. If you want to create a list of GUIDs and put them into a SQL table, you may write the following:

create table ids(id uniqueidentifier);

declare @guidCount int = 10000;
declare @n int = 0;
while @n < @guidCount
begin
   insert ids values(NEWID());
   set @n = @n + 1;
end

This code creates 10 000 GUIDs, but if you need more it can be very slow. You can then modify the code as follows:

create table ids(id uniqueidentifier);

declare @ids table(id uniqueidentifier default NEWID());
declare @guidCount int = 100000;
declare @n int = 0;
while @n < @guidCount
begin
   insert @ids default values;
   set @n = @n + 1;
end

insert ids with(tablockx)
select * from @ids;

This code creates 100 000 GUIDs in about the same period of time.
If you need even more GUIDs, the second approach can also be slow and memory-hungry. But what if we don't generate the GUIDs in memory at all? Instead, we generate a counter table and then use it to produce the GUIDs:

create table ids(id uniqueidentifier);

declare @ids table(id int identity)
declare @guidCount int = 1000000;
declare @n int = 0;
while @n < SQRT(@guidCount)
begin
   insert @ids default values;
   set @n = @n + 1;
end

insert ids with(tablockx)
select NEWID() from @ids i1, @ids i2;

This code creates 1 000 000 GUIDs, again in about the same period of time as the first and second ones!
I use this pattern to generate huge test databases. If you need volume testing, this pattern is for you.
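For illustration, the square-root trick can be sketched outside T-SQL as well. A minimal Python/sqlite3 version, where randomblob(16) stands in for NEWID() and all names are made up:

```python
import sqlite3
import math

# The square-root trick from the third snippet: materialize only sqrt(n)
# seed rows, then cross join the seed table with itself to get n rows,
# producing one GUID-like value per result row.
n = 10_000
k = math.isqrt(n)  # 100 seed rows

con = sqlite3.connect(":memory:")
con.execute("create table seed (i integer)")
con.executemany("insert into seed values (?)", [(i,) for i in range(k)])
con.execute(
    "create table ids as "
    "select lower(hex(randomblob(16))) as id from seed a, seed b"
)

count = con.execute("select count(*) from ids").fetchone()[0]
print(count)  # 10000
```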

bye.
See you on codeproject


Jun. 17th, 2009

SQL. Running Totals.

Running totals... Accountants like them so much. But there is no way in MS SQL to write an efficient query that calculates a running total. Why? What is a running total? It is just the previous running total plus the current value. But MS SQL syntax gives no access to previous rows. I searched the Internet for running totals; every link I clicked showed the same 2 methods:
  • join or subquery the table against itself. Here we have n² operations;
  • use a cursor and a temporary table. Here we have to insert into the temporary table and then join it back onto the base query.
So, both ways are not as fast as I need. But one can feel that with a procedural language it may be done very fast.
Yes! .NET helps us, since Microsoft added seamless support for .NET to SQL Server!
The idea is to create a .NET function that remembers values from the previous row, and to force the desired order for the running total in the SQL query.
First we need a function to keep the previous running total. Since SQL Server is a multithreaded environment, we must guarantee thread-safety and keep each running total in a thread-related variable. The CallContext class is a good fit for this purpose.
 
using System;
using System.Runtime.Remoting.Messaging;

namespace RunningTotal
{
    public static class RunningTotalUtils
    {
        public static decimal RunningTotal(decimal currentValue)
        {
            object _lastTotal = CallContext.GetData("runningTotal");
            decimal lastTotal = _lastTotal == null ?
                0 : Convert.ToDecimal(_lastTotal);
            lastTotal += currentValue;
            CallContext.SetData("runningTotal", lastTotal);
            return lastTotal;
        }
    }
}
 
Now it’s time to create a T-SQL wrapper for the function:
 
CREATE ASSEMBLY RunningTotal
FROM 'C:\RunningTotal.dll'
WITH PERMISSION_SET = UNSAFE;
GO

CREATE FUNCTION RunningTotal (@amount decimal(18,2))
RETURNS decimal(18,2)
AS EXTERNAL NAME
   RunningTotal.[RunningTotal.RunningTotalUtils].RunningTotal;
GO
 
And... the fastest running totals ever come with the following SQL query:
 
select *, dbo.RunningTotal(Amount) Total from Transactions;
 
This is correct for the following table
 
create table Transactions
(
   ID int identity primary key,
   Amount decimal(18,2)
);
 
It seems no one can do it faster.
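For illustration only, the behavior of the CLR function can be mimicked with an accumulator that remembers the previous total per thread, applied row by row in the desired order. A Python sketch; the names are not from the post:

```python
import threading

_state = threading.local()  # analogue of CallContext: per-thread storage

def running_total(current_value):
    # Remember the previous total for this thread, as the CLR function does.
    last = getattr(_state, "total", 0)
    last += current_value
    _state.total = last
    return last

amounts = [100, 250, -50]
totals = [running_total(a) for a in amounts]
print(totals)  # [100, 350, 300]
```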

Bye.
See you on codeproject
 


C#. Static cache and multithreading.

A few months ago I had to optimize a web portal (classified ads). The portal was built on ASP.NET 1.1. The portal seemed to have 2 main problems:
  • It failed when there was more than 1 user ("Collection was modified", etc.)
  • It returned different data for the same page when I set up more than 1 IIS worker process
The reason: the static cache and other static objects. Since static objects are shared by all threads but separate for every process, they caused both troubles. They conflicted in multithreaded operations and prevented querying SQL Server, "thinking" that the data had already been read.

There was no time or resource to redesign and recode the portal, so I needed something simple and powerful.

First, I tried the classical methods. I tried to synchronize the critical code blocks. But if you have a static collection of collections this is not really possible: all nested collections are candidates to fail in any part of the code.

Next, I tried not to enumerate the original lists, always creating a copy before enumeration to prevent modification before the enumeration finishes. However, this led to a set of new troubles, and I had to revert everything.

And then... a fresh idea entered my head. Now it seems quite obvious, but it took a long time to appear...

Look: if we have a static cache in the following form

static Hashtable Objects = new Hashtable();

we do not need to synchronize it. It is enough to give each thread its own cache, and we can easily do that with CallContext. First, create a thread-safe "static" object store:

public sealed class CallObjects
{
   public enum Names
   {
      STAT_OBJ_1,
      STAT_OBJ_2,
      ......... 
   }

   public static object Get(Names name)
   { 
      return CallContext.GetData(name.ToString());
   }

   public static void Set(Names name, object obj)
   {
      if ( obj == null )
         CallContext.FreeNamedDataSlot(name.ToString()); 
      else
         CallContext.SetData(name.ToString(), obj);
   }
}

Now we can use a static property that returns a thread-dependent Hashtable:

static Hashtable Objects
{
   get {
      Hashtable objects =
         (Hashtable) CallObjects.Get(CallObjects.Names.STAT_OBJ_1);
      if ( objects == null ) {
         objects = new Hashtable();
         CallObjects.Set(CallObjects.Names.STAT_OBJ_1, objects);
      }
      return objects;
   }
}


This simple modification saved the project's life, and that was a great saving of money and time.
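The same per-thread trick exists in other runtimes. A minimal Python sketch of the idea using threading.local (all names are illustrative, not from the post):

```python
import threading

# Python analogue of the CallObjects idea: each thread gets its own cache,
# so no locking is needed around reads and writes.
_slots = threading.local()

def get_cache():
    cache = getattr(_slots, "objects", None)
    if cache is None:
        cache = {}
        _slots.objects = cache
    return cache

results = {}

def worker(name):
    cache = get_cache()
    seen_before = "owner" in cache   # always False: the cache is per-thread
    cache["owner"] = name
    results[name] = (seen_before, cache["owner"])

threads = [threading.Thread(target=worker, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # each thread saw an empty cache and only its own value
```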

Bye.
See you on codeproject
 

SQL. How to find holes in sequences.

Have you ever been asked to find all the holes in document numbers (docs are numbered sequentially)? I have had to do this search twice. Once for auditing purposes: holes may mean some sort of forgery. And now customers want to reuse the numbers of deleted docs. So how can holes be found using SQL? Assume we have:

create table tbl (id int identity primary key);

So, there is nothing easier than to EXCEPT the table against itself with an incremented id.

select id+1 from tbl
except
select id from tbl;

This query returns all the holes. The result is the list of beginnings of the missing id ranges. The ends of the missing id ranges can be obtained with a similar query:

select id-1 from tbl
except
select id from tbl;

I also needed a function for the minimum free doc number. This is it:

select top 1 * from
(
   select id+1 as id from tbl
   except
   select id from tbl
) t
order by id;
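The EXCEPT approach is portable. A quick check of the first query in Python's sqlite3, with a made-up id list:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table tbl (id integer primary key)")
con.executemany("insert into tbl values (?)", [(1,), (2,), (5,), (6,), (9,)])

# Beginnings of the missing ranges: shift ids up by one, subtract existing ids.
# Note the last value (max id + 1) is the minimum free number, not a hole.
gap_starts = con.execute(
    "select id + 1 as id from tbl except select id from tbl order by id"
).fetchall()
print([g for (g,) in gap_starts])  # [3, 7, 10]
```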


bye.
See you on codeproject
