Custom sorting algorithm needed

I have a need for an unusual sorting algorithm which would be massively useful to a lot of people, but I would prefer to leave the specific application vague as I have not found particularly good solutions in my research and was wondering if folks here could bring new ideas to the table. This is a real-world sort, so it has some restrictions which are different from many algorithms. Here are the requirements.

  • The lists to be sorted do not have a uniform number of elements.
  • The values by which elements are sorted are not directly observable.
  • The comparison operation of two elements is expensive.
  • You may run as many comparison operations in parallel as you wish with no increase in expense.
  • Each element may only participate in one comparison operation at a time.
  • The result of a comparison operation only gives greater than, less than, or equal.
  • There is a probability that a comparison operation returns an incorrect result, and this probability varies with the difference between the hidden values of the elements being compared.
  • We have no indication when the comparison gives an incorrect value.
  • We may assume that the dynamic error rate of comparison is normally distributed.

  • Elements might intermittently be unavailable for comparison.

So, a shot in the dark, hoping for somebody with an itch. The general gist is that you want to find the best way to set up a set of parallel comparisons to reveal as much information about the proper sort order as possible. A good answer would be able to describe the probability of error after n groups of actions. I'm sure some folks will be able to figure out what is being sorted based on this information, but for those who can't, believe me, there are many, many people who would benefit from this algorithm.
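
To make the setup concrete, here is a minimal sketch of how such a comparator could be simulated; the Gaussian noise model, the class name, and noise_sigma are illustrative assumptions rather than part of the actual problem.

```python
import math
import random

class NoisyComparator:
    """Toy model of the comparison described above.

    Each element has a hidden value that is not directly observable, a
    comparison is treated as expensive (we count calls), and the chance of a
    wrong answer grows as the hidden values get closer together.  The Gaussian
    noise model is an assumption made only for illustration.
    """

    def __init__(self, hidden_values, noise_sigma=1.0):
        self.hidden = list(hidden_values)   # true keys, never exposed directly
        self.sigma = noise_sigma            # spread of the comparison noise
        self.calls = 0                      # number of expensive comparisons issued

    def compare(self, i, j):
        """Return -1, 0, or 1 for elements i and j, possibly incorrectly."""
        self.calls += 1
        # Perturb each hidden value; when the true values are close, the
        # perturbed ordering flips (or ties) more often.
        a = self.hidden[i] + random.gauss(0.0, self.sigma)
        b = self.hidden[j] + random.gauss(0.0, self.sigma)
        if math.isclose(a, b):
            return 0
        return -1 if a < b else 1


def run_group(comparator, pairs):
    """Issue one 'group of actions': comparisons on a set of disjoint pairs.

    Because no element appears in more than one pair, all of these comparisons
    could be issued simultaneously in the real system.
    """
    indices = [x for pair in pairs for x in pair]
    assert len(indices) == len(set(indices)), "pairs must be disjoint"
    return {pair: comparator.compare(*pair) for pair in pairs}
```

Running many simulated trials against a harness like this is one way to estimate the probability of error after n groups of actions for any candidate scheme.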


I'd look at comparator networks. One of their assumptions is the ability to do multiple comparisons in parallel, and the usual goal is to minimize the number of "layers" of comparisons. A so-called AKS network can achieve O(log n) depth, i.e. O(log n) parallel time, this way.
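
As a much simpler concrete example than AKS, here is a sketch of an odd-even transposition network: every layer consists of disjoint adjacent pairs, so each element takes part in at most one comparison per layer and all comparisons in a layer can run in parallel. It needs O(n) layers rather than O(log n), so it only illustrates the layered structure, not the asymptotics.

```python
def odd_even_transposition_layers(n):
    """Yield the comparator layers of an odd-even transposition network.

    Each layer is a list of disjoint index pairs (i, i + 1), so all comparisons
    within a layer can run in parallel.  With an error-free comparator, n
    layers are enough to sort any n-element list.  (Batcher's odd-even
    mergesort has O(log^2 n) depth and the AKS network O(log n), at the cost of
    far more complicated wiring.)
    """
    for layer in range(n):
        start = layer % 2   # alternate (0,1),(2,3),... with (1,2),(3,4),...
        yield [(i, i + 1) for i in range(start, n - 1, 2)]


def network_sort(n, compare):
    """Run the network on element indices 0..n-1 using compare(i, j) -> -1/0/1.

    Returns a permutation of the indices in (estimated) ascending order of the
    hidden values.
    """
    order = list(range(n))
    for layer in odd_even_transposition_layers(n):
        # Pairs within a layer are disjoint, so these comparisons could be
        # issued simultaneously; here we simply loop over them.
        for a, b in layer:
            if compare(order[a], order[b]) > 0:
                order[a], order[b] = order[b], order[a]
    return order
```

For example, `network_sort(len(values), NoisyComparator(values).compare)` would drive the network with the simulated comparator sketched earlier.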

But they work under the assumption that all comparisons are done correctly. I guess that handling errors could be done afterwards, by adding an additional layer of comparators that compares every two consecutive items after the main sort...
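
A rough sketch of that extra verification layer, reusing a compare(i, j) -> -1/0/1 oracle as above; the repeat count and majority vote are illustrative additions, and the adjacent pairs are split into even and odd sub-layers so that no element is used in two comparisons at once.

```python
def verify_adjacent(order, compare, repeats=3):
    """Extra 'verification layer': re-check consecutive items after the main sort.

    Each adjacent pair is compared `repeats` times and the majority result is
    used, which cheaply catches some (not all) comparison errors.  Returns the
    positions where the order had to be corrected.
    """
    corrected = []
    # Even-indexed pairs first, then odd-indexed pairs, so each sub-layer uses
    # disjoint pairs and could run fully in parallel.
    for start in (0, 1):
        for i in range(start, len(order) - 1, 2):
            votes = sum(compare(order[i], order[i + 1]) for _ in range(repeats))
            if votes > 0:   # majority says the pair is out of order
                order[i], order[i + 1] = order[i + 1], order[i]
                corrected.append(i)
    return corrected
```

A single pass like this cannot guarantee a correct order under noisy comparisons; it only reduces the chance that neighbouring elements remain inverted.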

Starting point: the Wikipedia article on sorting networks.

Anyway, this looks more like a scientific research topic.

